Joseph H. Schuessler Ph.D.

CHAPTER 4

DATA ANALYSIS AND RESULTS

The data analysis and results of the study are presented in this chapter, which is organized into nine sections: 1) data collection and response rate, 2) analysis of non-response bias, 3) sample characteristics, 4) threats instrument, 5) countermeasures instrument, 6) information systems security effectiveness instrument, 7) assessment of validity, 8) PLS analysis, and finally, 9) hypothesis testing.  The first section describes the data collection procedures for the survey and discusses the response rate.  This is followed by an analysis of non-response bias and then by a summary of sample characteristics such as the industries from which responses were received, the gender of the respondents, and so on.  Next is a discussion of the validity and reliability of the measurement instruments.  The chapter concludes with the results of the hypothesis tests.

Data Collection and Response Rate

To contact IT professionals for the study, the AITP leadership forwarded an email to its 1,500 professional members.  The email consisted of a brief description of the survey and the benefits of participating, along with a link to the survey itself, which contained additional information.  Three weeks after the first email, a follow-up email was sent thanking those who had already participated and encouraging those who had not to please do so.  Finally, three weeks after the second email, a link was included in the AITP's monthly online newsletter.  Of the 1,500 professional members, 73 completed surveys were received, representing a response rate of 4.9%.  While this is a low response rate, the sensitive nature of the subject matter should be taken into account.  Additionally, of the 332 members who actually clicked on the link to the survey, approximately 22% completed it.  A cursory examination of the demographics of the respondents reveals that a range of organizations, in terms of both size and industry, is represented.

Analysis of Non-Response Bias

The purpose of performing a non-response analysis is to identify characteristics that may differ between respondents and non-respondents in order to uncover any bias that may exist within a dataset.  While directly asking non-respondents why they did not participate would be ideal, such non-participants would be unlikely to respond to further inquiries given their lack of participation in the initial one.  Another method of assessing non-response bias is to compare early and late responders to the survey.  Table 7 below displays a comparison of means between early and late responders: an independent samples t-test was performed on responses to eight demographic variables.  The table illustrates that there are no significant differences between early and late respondents at the .05 level of significance.

 

 

Table 7. Independent Samples Test of Early vs. Late Respondents

Variable              Levene's F   Sig.    t       df   Sig. (2-tailed)   Mean Diff.   Std. Error   95% CI Lower   95% CI Upper
Age                   5.063        .037    -.768   18   .452              -.40000      .52068       -1.49392       .69392
Gender                5.063        .037    1.406   18   .177              .30000       .21344       -.14842        .74842
Industry              .154         .700    -.187   18   .854              -1.400       7.499        -17.155        14.355
Organizational Size   .000         1.000   .000    16   1.000             .000         .248         -.527          .527
Security Budget       3.115        .095    .000    18   1.000             .000         1.145        -2.406         2.406
Employment Length     .157         .697    .567    18   .578              3.200        5.647        -8.663         15.063
Level of Education    3.947        .062    1.897   18   .074              .60000       .31623       -.06437        1.26437
Organizational Role   .212         .651    -.336   18   .740              -.20000      .59442       -1.44883       1.04883
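
As an illustrative sketch only (placeholder data, not the study's responses; the original comparison was presumably run in SPSS), the early- vs. late-responder test for any one demographic variable could be reproduced in Python as follows.

```python
# Early- vs. late-responder comparison for one demographic variable
# (hypothetical values; the study's actual data are not reproduced here).
from scipy import stats

early = [34, 41, 55, 29, 46, 38, 50, 44, 61, 33]  # hypothetical ages
late = [36, 48, 52, 31, 59, 40, 47, 45, 58, 39]   # hypothetical ages

# Levene's test checks the equal-variance assumption behind the t-test.
levene_F, levene_p = stats.levene(early, late)

# Independent samples t-test; equal_var mirrors SPSS's "equal variances
# assumed" row when Levene's test is not significant.
t, p = stats.ttest_ind(early, late, equal_var=levene_p > .05)

print(f"Levene F={levene_F:.3f} (p={levene_p:.3f}); t={t:.3f}, p={p:.3f}")
```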

 

 

Sample Characteristics

Using the industry categories established by the Small Business Administration, an industry classification was gathered from each respondent.  An examination of Table 8 reveals that Finance and Insurance as well as Manufacturing were the best represented industries.  Several industries were not represented in the study, including Real Estate, Utilities, and Wholesale Trade, to name a few.

Table 8. Description of Respondents by Industry

Industry                                                                          Frequency   %      Cumulative %   S/NS
Accommodation and Food Services (SBA 72)                                          0           0      0              S
Administrative and Support, Waste Management and Remediation Services (SBA 56)    0           0      0              S
Agriculture, Forestry, Fishing and Hunting (SBA 11)                               0           0      0              NS
Arts, Entertainment and Recreation (SBA 71)                                       0           0      0              S
Construction (SBA 23)                                                             3           4.1    4.1            NS
Educational Services (SBA 61)                                                     8           11.0   15.1           S
Finance and Insurance (SBA 52)                                                    15          20.5   35.6           S
Health Care and Social Assistance (SBA 62)                                        3           4.1    39.7           S
Information (SBA 51)                                                              9           12.3   52.0           S
Management of Companies and Enterprises (SBA 55)                                  0           0      52.0           S
Manufacturing (SBA 31-33)                                                         10          13.7   65.7           NS
Mining (SBA 21)                                                                   0           0      65.7           NS
Professional, Scientific and Technical Services (SBA 54)                          6           8.2    73.9           S
Public Administration (SBA 92)                                                    6           8.2    82.1           S
Real Estate and Rental and Leasing (SBA 53)                                       0           0      82.1           S
Retail Trade (SBA 44-45)                                                          3           4.1    86.2           NS
Transportation (SBA 48-49)                                                        3           4.1    90.3           S
Utilities (SBA 22)                                                                0           0      90.3           S
Wholesale Trade (SBA 42)                                                          0           0      90.3           S
Other Services (SBA 81)                                                           7           9.6    99.9           S
Total                                                                             73

 

 

 

* Note: The SBA number with each industry name identifies the top-level industry classification.  These classifications were consolidated in the right-most column to indicate whether each industry is services (S) or non-services (NS), based on information retrieved from the NAICS web site at http://www.naics.com/info.htm on 2-4-2009.  This consolidation was done for later use in the PLS analysis.

 

In order to efficiently and effectively incorporate industry in the analysis, it was necessary to classify each industry as either service oriented or non-service oriented.  The SBA uses the North American Industry Classification System (NAICS) to identify and classify industries.  NAICS codes replaced Standard Industrial Classification (SIC) codes and standardize industry classifications across the United States, Canada, and Mexico.  Based on a comparison of the NAICS sectors and SIC divisions listed on the NAICS web site, certain NAICS codes were identified as service oriented; those not so classified were identified as non-service oriented.
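
Purely as an illustration (this helper is hypothetical and simply restates the consolidation shown in Table 8; it was not part of the original analysis), the service/non-service mapping could be coded as follows.

```python
# Hypothetical restatement of Table 8's service/non-service consolidation.
# The set holds the top-level SBA/NAICS sector codes marked NS in Table 8.
NON_SERVICE_SECTORS = {"11", "21", "23", "31-33", "44-45"}

def industry_orientation(sector_code: str) -> str:
    """Return 'NS' for non-service sectors and 'S' for all others."""
    return "NS" if sector_code in NON_SERVICE_SECTORS else "S"

print(industry_orientation("52"))     # Finance and Insurance -> S
print(industry_orientation("31-33"))  # Manufacturing -> NS
```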

Organizational size as defined by number of employees was bimodal, with the majority of organizations having either 100 or fewer employees or more than 1,500.  However, for purposes of this research, organizational size was defined using the Small Business Administration's (SBA's) definitions of small and large businesses.  Using the SBA classification scheme, 31 businesses were identified as small and 36 as large.  Six businesses did not fit into either classification because the respondents classified themselves as Public Administration, which the SBA specifically does not classify as either large or small.  The results are summarized in Table 9.  Organizational size by annual receipts is also reported in the table and indicates that approximately half of the organizations had annual receipts of less than $32.5 million.

 

Table 9. Organizational Size

# Employees         Frequency   %      Cumulative %
0-100               26          35.6   35.6
101-500             13          17.8   53.4
501-1000            6           8.2    61.6
1001-1500           3           4.1    65.7
Greater than 1500   25          34.2   99.9
Total               73

SBA Classification   Frequency
Small                31
Large                36
Unclassified         6
Total                73

Annual Receipts                       Frequency   %
Less than or equal to $32.5 million   35          47.9
Greater than $32.5 million            38          52.1

 

 

 

 

 

 

Respondents were also asked to identify the percentage of the IS budget that best reflects the amount spent on security.  Table 10 below details the responses.  Of particular interest is that a large percentage of respondents reported that security represented less than 1 percent of the overall IS budget, while another large percentage indicated that they did not know how much of the IS budget was devoted to security.  The 19 respondents who indicated that security receives less than 1 percent of the IS budget could be an indication that effective risk management procedures are in place, allowing for effective cost controls that limit expenditures.  Conversely, it could be an indication that risk management procedures have not been undertaken and, as a result, there is a fundamental lack of understanding of the exposure that an organization may face.  At the other extreme, the 20 respondents who indicated that they did not know security's percentage of the IS budget could also reflect contrasting issues.  The same fundamental lack of understanding of security issues and risk exposure could be taking place at these organizations.  However, it is possible that the risk management process has become so intertwined with other IS planning, analysis, development, implementation, and maintenance processes that delineating what emanates from the security budget versus the rest of the IS budget could be difficult if not impossible (Young, 2008).

 

Table 10. Security Budget as a Percentage of IS Budget

Security Budget   Frequency   %      Cumulative %
Less than 1%      19          26.0   26.0
1%-2%             9           12.3   38.4
3%-5%             6           8.2    46.6
6%-7%             2           2.7    49.3
8%-10%            8           11.0   60.3
More than 10%     9           12.3   72.6
Unknown           20          27.4   100.0

 

In order to gain a further understanding of the relationship between organizational size and the percentage of the IS budget represented by security, a cross-tab and chi-square test of independence were conducted.  Because one of the requirements for chi-square is that each expected cell count must be five or greater, some of the security budget categories were combined.  Respondents who indicated that their security budget was unknown were not included in this analysis.  Table 11 below shows the results.  The results indicate that there is in fact a relationship between whether an organization is large or small and the percentage of its IS budget spent on security.  Small organizations tended to spend either significantly less or significantly more of their IS budget on security than would statistically be anticipated; no small organizations were found to spend the more moderate amount of 1-7% of their IS budget on security.  Larger organizations were more evenly distributed, though they tended toward the moderate categories more than would statistically be expected.

 

Table 11. Cross-Tab of IS Security Budget and Organizational Size

                            Percentage of IS Budget Spent on Security
Organizational Size         Less than 1%   1%-7%   8% or Greater   Total
Small      Observed         11             0       9               20
           Expected         6.8            6.0     7.2             20.0
Large      Observed         5              14      8               27
           Expected         9.2            8.0     9.8             27.0
Total      Observed         16             14      17              47
           Expected         16.0           14.0    17.0            47.0

χ² = 15.613, df = 2, p-value = .000 (α = .05)
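
As a hedged sketch (the observed counts below are taken directly from Table 11, but the original test was presumably run in SPSS), the chi-square test of independence can be reproduced with SciPy.

```python
# Chi-square test of independence from Table 11's observed counts.
from scipy.stats import chi2_contingency

observed = [
    [11, 0, 9],   # Small: <1%, 1%-7%, >=8%
    [5, 14, 8],   # Large: <1%, 1%-7%, >=8%
]

chi2, p, df, expected = chi2_contingency(observed)
print(f"chi2={chi2:.3f}, df={df}, p={p:.4f}")  # ~15.6, matching Table 11
print("expected cell counts:")
print(expected.round(1))
```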

 

 

 

 

Additional insight was sought into the relationship between organizational size and industry in order to make certain that industries were statistically well represented as either service or non-service.  A cross-tab query was also constructed to test the relationship between these two variables, as shown in Table 12 below.  The results indicate that there was no relationship between organizational size and industry classification among respondents, which strengthens any conclusions drawn as they relate to each of these constructs.

Table 12. Cross-Tab of Organizational Size and Industry

                            Industry
Organizational Size         Non-Services   Services   Total
Small      Observed         9              22         31
           Expected         9.3            21.7       31.0
Large      Observed         11             25         36
           Expected         10.7           25.3       36.0
Total      Observed         20             47         67
           Expected         20.0           47.0       67.0

χ² = .018, df = 1, p-value = .892 (α = .05)

 

 

 

The organizational roles of the respondents were fairly well distributed.  As can be seen in Table 13, the majority of respondents, 48 out of 73, were responsible for various IS activities at their organizations.  Only six identified themselves as top management, while 19 did not fit neatly into one of the predefined categories.  The average organizational tenure of respondents was 17.37 years, suggesting considerable organizational experience for most of the respondents.  Table 14 illustrates that most of those sampled had been with their organization longer than four years.

 

Table 13. Organizational Roles

Role                     Frequency   %      Cumulative %
Top Management           6           8.2    8.2
IS Directors             20          27.4   35.6
IS Middle Management     20          27.4   63.0
Security Professionals   8           11.0   74.0
Other                    19          26.0   100.0
Total                    73          100.0

 

 

Table 14. Respondent Tenure

Years   Frequency   %      Cumulative %
1       6           8.2    8.2
2       3           4.1    12.3
4       5           6.8    19.2
8       3           4.1    23.3
9       3           4.1    27.4
10      8           11.0   38.4
11      3           4.1    42.5
12      3           4.1    46.6
13      3           4.1    50.7
15      2           2.7    53.4
17      3           4.1    57.5
19      3           4.1    61.6
20      2           2.7    64.4
24      2           2.7    67.1
25      3           4.1    71.2
27      3           4.1    75.3
30      3           4.1    79.5
33      3           4.1    83.6
34      3           4.1    87.7
35      6           8.2    95.9
38      3           4.1    100.0
Total   73          100.0

 

 

Threats Instrument

The average response to the question 'to what degree does each threat represent a risk to your organization' was 4.28.  Measured on a 7-point Likert scale, this indicates a slightly greater than neutral response.  Quality of service deviations from service providers represented the greatest threat to those who responded to the survey.  Interestingly, this item also had the lowest standard deviation, indicating greater agreement among respondents.
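
As an illustrative sketch of how the descriptive statistics in Table 15 could be tabulated (pandas on hypothetical response data, not the study's dataset):

```python
# Likert-item descriptives in the style of Table 15 (hypothetical data;
# each column holds one threat item scored 1-7 by each respondent).
import pandas as pd

responses = pd.DataFrame({
    "Quality of service deviations": [6, 5, 7, 4, 6],
    "Deliberate software attacks": [5, 2, 6, 3, 7],
    "Pandemics": [2, 3, 4, 2, 3],
})

summary = pd.DataFrame({
    "Mean": responses.mean().round(2),
    "Std. Dev.": responses.std().round(3),  # sample standard deviation
})
summary["Rank"] = summary["Mean"].rank(ascending=False).astype(int)
print(summary.sort_values("Rank"))
```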

 

Table 15. Descriptive Statistics for Threats

Threat                                       N    Mean   Rank   Std. Dev.
Quality of service deviations                73   5.52   1      1.492
Technical software failures or errors        73   4.71   2      1.603
Acts of human error or failure               73   4.64   3      1.576
Deliberate software attacks                  73   4.59   4      1.847
Accidental destruction of data               73   4.55   5      1.811
Accidental entry of bad data                 73   4.42   6      1.885
Failure to follow policies and procedures    73   4.25   7      1.816
Social engineering                           73   4.23   8      1.696
Deliberate acts of theft                     73   4.16   9      1.856
Deliberate acts of espionage or trespass     73   4.11   10     1.976
Forces of nature                             73   4.10   11     2.063
Disgruntled employees                        73   4.10   11     1.909
Accidental destruction of hardware           73   3.58   13     2.027
Pandemics                                    73   2.95   14     1.817

 

Countermeasures Instrument

The average response to the question 'to what degree does your organization use each countermeasure' was 4.91.  Measured on a 7-point Likert scale, this indicates a slightly greater than neutral response.  While the ranking of some countermeasures, such as the use of anti-virus software, should come as no surprise, others, such as user training and education, are somewhat troubling given the numerous studies that discuss not only the effectiveness of user education and training but its relatively low cost as well (Schultz, 2004).

 

Table 16. Descriptive Statistics for Countermeasures

Countermeasure                                                                                 N    Mean   Rank   Std. Dev.
Use of virus protection software                                                               73   6.56   1      .943
Use of firewalls                                                                               73   6.30   2      1.198
Media backup                                                                                   73   6.11   3      1.318
Manage patch/update process                                                                    73   5.85   4      1.664
Physical area security                                                                         73   5.70   5      1.738
Password policies                                                                              73   5.36   6      1.946
Use of rights management software                                                              73   5.33   7      2.000
Use of auto account lock/logoff                                                                73   5.26   8      1.878
Alarm systems                                                                                  73   5.22   9      1.917
Perform background checks                                                                      73   5.21   10     2.267
Consistently apply security policy                                                             73   4.99   11     1.975
Use of internal measures to enforce/protect the organization's interests                       73   4.74   12     2.041
Publish formal standards                                                                       73   4.73   13     2.213
Suspicious activity reports                                                                    73   4.70   14     2.240
Contingency planning                                                                           73   4.64   15     1.821
Audits of various system logs                                                                  73   4.62   16     1.831
Work with external legal/regulatory agencies to enforce/protect the organization's interests   73   4.53   17     2.062
Penetration/vulnerability testing                                                              73   4.34   18     2.168
Use of redundant assets and facilities                                                         73   4.30   19     2.367
Encourage violations reporting                                                                 73   4.25   20     2.184
User training/education                                                                        73   4.23   21     1.997
Use of cameras                                                                                 73   3.78   22     2.540
Drill procedures                                                                               73   3.60   23     2.271
Warning signs                                                                                  73   3.49   24     2.187

Information Systems Security Effectiveness Instrument

The final construct examined is the ISS effectiveness construct.  The means for the items in the construct averaged well above what would be considered a neutral response.  One interesting observation from Table 17 is that the asset categories defined by Straub (1990) were consistently scored higher than the dimensions developed from General Deterrence Theory.

Table 17. Descriptive Statistics for Information Systems Security Effectiveness

Effectiveness Dimension                   N    Mean   Rank   Std. Dev.
Overall deterrent effect                  73   4.88   6      1.499
Overall preventive effect                 73   5.42   3      1.632
Overall detection effect                  73   4.75   7      1.847
Overall remedy effect                     73   4.66   8      1.974
Effect in protecting hardware             73   5.73   1      1.004
Effect in protecting software             73   5.48   2      1.029
Effect in protecting computing services   73   5.32   4      1.212
Effect in protecting data                 73   5.30   5      1.738

Assessment of Validity

Construct validity was assessed by performing a factor analysis on the items in each instrument and calculating the reliability of the resulting factors. Principal component factor analysis using Direct Oblimin rotation with Kaiser normalization was conducted on the three measurement instruments. Direct Oblimin rotation was used because, according to Hair et al. (1998), oblique rotational methods are preferable when the goal of the factor analysis is to produce meaningful factors as opposed to item reduction.  Table 18 shows the results of the factor analysis for the countermeasures construct as conceptualized using the four dimensions of General Deterrence Theory: deterrence, detection, prevention, and remedy.  Four factors were identified, all with eigenvalues greater than 1.  According to Hair et al. (1998), loadings of .5 or greater represent items of practical significance.  After examining the factor loadings, six items were removed because they failed to reach that level on any factor.  The remaining items did not cross-load on any other factor using .5 as the cross-loading criterion; as a result, the countermeasures construct exhibits a high degree of discriminant validity.  Using an exploratory factor analysis approach to analyze the associations of each item, each factor was assigned to the appropriate dimension of General Deterrence Theory.  Evidence of convergent validity is demonstrated by factor loadings greater than 0.5, which are highlighted in Table 18.
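
A hedged sketch of this extraction-and-rotation step (using the Python factor_analyzer package and synthetic placeholder data; the original analysis was presumably conducted in SPSS, and factor_analyzer's principal method is assumed here to approximate SPSS's principal components extraction):

```python
# Principal-components extraction with oblique (Direct Oblimin) rotation,
# mirroring the procedure described above. Data are synthetic placeholders,
# not the study's responses.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # pip install factor-analyzer

rng = np.random.default_rng(1)
items = pd.DataFrame(rng.integers(1, 8, size=(73, 24)),
                     columns=[f"cm{i + 1}" for i in range(24)])

fa = FactorAnalyzer(n_factors=4, rotation="oblimin", method="principal")
fa.fit(items)

loadings = pd.DataFrame(fa.loadings_, index=items.columns).round(3)
# Keep only loadings meeting Hair et al.'s .5 practical-significance cut.
print(loadings.where(loadings.abs() >= .5))
```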

After the factor analysis, the Cronbach's alpha of each factor was calculated in order to assess the reliability of each dimension of the countermeasures construct. Cronbach's alpha measures the internal consistency of the items in a factor.  The lower limit for an acceptable Cronbach's alpha is 0.7 (Hair et al., 1998), though 0.6 may be acceptable for newly defined scales.  The results are displayed at the bottom of Table 18.  All are well above the 0.7 threshold, indicating a high level of internal consistency in each measure.  The total variance explained is 75.43%, indicating that these four dimensions account for a significant amount of the variance.
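
Cronbach's alpha has a simple closed form, so the reliability check can be sketched directly in NumPy (placeholder data again): α = k/(k-1) · (1 − Σ item variances / variance of the summed scale).

```python
# Cronbach's alpha for the items belonging to one factor.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix for a single factor's items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # one variance per item
    scale_variance = items.sum(axis=1).var(ddof=1)   # variance of total score
    return (k / (k - 1)) * (1 - item_variances.sum() / scale_variance)

# Hypothetical 5 respondents x 3 items scored 1-7.
demo = np.array([[5, 6, 5], [3, 3, 4], [6, 7, 6], [2, 3, 2], [4, 5, 5]])
print(round(cronbach_alpha(demo), 3))
```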

Table 18. Countermeasure Factor Matrix

Item                                                                                                              Deterrence   Prevention   Detection   Remedy
User (re)training/(re)education (Countermeasure 16)                                                               .896         .111         -.072       -.040
Warning signs informing violators of possible protective measures and civil/legal remedies (Countermeasure 6)    .721         .147         -.328       .137
Penetration/Vulnerability testing (Countermeasure 3)                                                              .656         -.288        .001        .264
Use of auto account lock/logoff (Countermeasure 14)                                                               .519         -.439        .057        .129
Suspicious activity reports generated from intrusion detection systems (Countermeasure 1)                         .490         -.392        .118        .237
Use of firewalls (Countermeasure 17)                                                                              -.134        -.962        -.030       -.110
Use of virus protection software (Countermeasure 22)                                                              -.163        -.907        .100        .040
Media backup (Countermeasure 18)                                                                                  .149         -.891        .012        -.016
Manage patch and update procedures (Countermeasure 12)                                                            .042         -.835        -.201       .043
Consistently apply organization's security policy (Countermeasure 8)                                              .366         -.498        -.069       .237
Plan for various contingencies (Countermeasure 9)                                                                 .126         -.489        -.396       .298
Password policies that enforce strength and frequency of change (Countermeasure 19)                               .194         -.452        -.404       .002
Use of cameras to demonstrate monitoring of sensitive areas (Countermeasure 7)                                    -.221        .072         -.825       .231
Use of redundant assets and facilities (Countermeasure 13)                                                        -.131        -.090        -.800       .222
Encourage violations reporting (Countermeasure 5)                                                                 .280         .079         -.772       .020
Drill to make sure contingency plans are effective (Countermeasure 10)                                            .345         .039         -.644       .201
Physical area security (Countermeasure 20)                                                                        .272         -.336        -.594       -.295
Alarm systems to protect from intrusion and fire (Countermeasure 2)                                               .388         -.137        -.571       -.149
Use of rights management to control access to workstation/network resources (Countermeasure 15)                   .101         -.417        -.564       -.368
Audit of various system logs (Countermeasure 4)                                                                   .347         -.295        -.382       .234
Publish formal standards (Countermeasure 21)                                                                      .253         -.308        -.321       .248
Work with external legal and regulatory entities to enforce and protect the organization's interests (Countermeasure 23)                .336   .017    .036    .816
Perform background checks as condition of employment or promotion (Countermeasure 11)                             -.257        -.145        -.264       .745
Use internal measures such as verbal warnings, reprimands, and termination to enforce and protect the organization's interests (Countermeasure 24)   .243   .074   -.126   .630
Eigenvalue                                                                                                        12.233       2.723        1.688       1.459
Variance Explained (%)                                                                                            50.972       11.345       7.033       6.079
Cronbach's Alpha                                                                                                  .870         .923         .903        .784

Note: Highlighted blocks represent the items for each construct used in the PLS analysis.  Items not highlighted under any factor did not load sufficiently and as such were not used in the PLS analysis.

 

A principal component factor analysis using Direct Oblimin rotation with Kaiser normalization was also conducted on the threats construct.  Table 19 shows the results of this factor analysis.  The current research conceptualizes threats as uni-dimensional for ease of analysis; however, it should be noted that the factor analysis identified four distinct factors, all with eigenvalues greater than 1.  Again adhering to Hair's recommendation that loadings of .5 or greater represent items of practical significance, all but a single item loaded on a single factor.  Social engineering cross-loaded on factors one and three and as a result was removed from the analysis.  The remaining items did not cross-load on any other factor using .5 as the criterion; as a result, the threats construct exhibits a high degree of discriminant validity.  Because threats are treated uni-dimensionally in the current research, no attempt was made to identify the unique characteristics of each dimension of the threat construct.  Evidence of convergent validity is demonstrated by factor loadings greater than 0.5, which are highlighted in Table 19.

Due to the theoretical development of the research model, an additional factor analysis was conducted, this time constraining the number of factors to one.  This was done in order to ease the analysis in the PLS model.  Again using Hair's .5 criterion, accidental entry of bad data, forces of nature, social engineering, and quality of service deviations were eliminated from the list of threats.  As a result, each highlighted threat in the "Constrained" column of Table 19 was used in the PLS analysis.

After the factor analysis was completed, the Cronbach's alpha of each factor was calculated in order to assess the reliability of each dimension of the threats construct. The results are displayed at the bottom of Table 19.  All are well above the 0.7 threshold, indicating a high level of internal consistency in each measure.  The constrained items had a Cronbach's alpha of .889.  The total variance explained by the constrained solution is 40.149%, indicating that it still accounts for a significant amount of the variance in the model.

Table 19. Threats Factor Matrix

Item                                                                                             Factor 1   Factor 2   Factor 3   Factor 4   Constrained
Accidental destruction of data (3)                                                               .818       .051       -.203      .241       .699
Accidental entry of bad data (1)                                                                 .759       .205       .039       -.209      .463
Accidental destruction of hardware (2)                                                           .586       .178       -.324      .090       .718
Social engineering (10)                                                                          -.578      .413       -.539      .096       .437
Quality of service deviations from service providers such as electricity, Internet, and so on (14)   .039   .780       .348       .228       .384
Acts of human error or failure (7)                                                               .035       .701       -.075      .041       .598
Disgruntled employees (8)                                                                        .113       .694       -.031      .349       .689
Failure to follow policies and procedures (4)                                                    -.037      .675       -.459      -.232      .759
Technical software failures or errors (5)                                                        .424       .637       -.130      -.133      .752
Deliberate acts of espionage or trespass (unauthorized access and/or data collection) (12)       .018       .239       -.861      -.118      .815
Deliberate software attacks (11)                                                                 .193       -.236      -.819      -.026      .557
Pandemics (9)                                                                                    -.015      .119       -.689      .350       .712
Deliberate acts of theft (13)                                                                    .477       -.067      -.619      .074       .706
Forces of nature (6)                                                                             -.011      .083       -.026      .907       .335
Eigenvalue                                                                                       5.621      2.199      1.601      1.003      5.621
Variance Explained (%)                                                                           40.15      15.71      11.44      7.17       40.149
Cronbach's Alpha                                                                                 .803       .772       .851       1.00       .889

 

Lastly, a principal component factor analysis using Direct Oblimin rotation with Kaiser normalization was conducted on the Information Systems Security Effectiveness construct, as displayed in Table 20.  As with the threats construct, ISSE is implemented as a uni-dimensional construct for ease of analysis.  However, it should be noted that the factor analysis identified two distinct factors, both with eigenvalues greater than 1.  Though the factors do not break neatly across the theoretical dimensions developed by Kankanhalli et al. (2003), they come close, with only a single item cross-loading.

As discussed above, due to the theoretical development of the research model, an additional factor analysis was conducted, this time constraining the number of factors to one.  This was done in order to ease the analysis in the PLS model.  Again using Hair's .5 criterion, all items loaded on the single factor.  The item loadings are detailed in Table 20.

After the factor analysis for the ISSE construct, Cronbach's alpha was calculated in order to assess reliability. The results are displayed at the bottom of Table 20.  At .888, the ISSE reliability is well above the 0.7 threshold, indicating a high level of internal consistency in the measure.  The total variance explained is 59.94%, indicating that a single factor still explains a large amount of the variance.

 

 

Table 20. Information Systems Security Effectiveness Factor Matrix

Item                                          Factor 1   Factor 2   Constrained
Overall detection effect (3)                  .932       .057       .771
Overall remedy effect (4)                     .781       .103       .598
Overall preventive effect (2)                 .778       -.135      .799
Overall deterrent effect (1)                  .718       -.079      .699
Effect in protecting data (8)                 .550       -.434      .854
Effect in protecting hardware (5)             -.128      -1.005     .746
Effect in protecting software (6)             .048       -.911      .821
Effect in protecting computing services (7)   .186       -.827      .870
Eigenvalue                                    4.80       1.25       4.795
Variance Explained (%)                        59.94      15.59      59.94
Cronbach's Alpha                              .861       .927       .888

 

PLS Analysis

PLS was used to analyze the proposed research model and test the hypotheses set forth earlier.  PLS has several advantages over traditional statistical techniques.  Like other structural equation techniques, PLS is able to concurrently test the measurement and structural models.  Additionally, PLS is not constrained to data sets that meet homogeneity and normality requirements (Chin et al., 2003), and it can handle smaller sample sizes than other structural techniques.  However, PLS is limited in its ability to measure non-recursive relationships.  As recommended by Chin (2008), a recursive version of the model was run; the recursive model lacked the proposed relationships from each countermeasure back to the threats construct.  Using SmartPLS version 2.0 (Ringle, Wende & Will, 2005), the modified model was analyzed to assess the measurement model and the structural paths between the constructs. In order to obtain reliable results and t-values, 200 random samples of 100 were generated using a bootstrap procedure. Then, again following Chin's suggestion, the non-recursive aspects of the model were assessed by taking the construct scores from the SmartPLS output and importing them into SPSS for analysis.  A two-stage least squares analysis was conducted to determine the R2 value for the threats construct as well as the path coefficients between each countermeasure and the threats construct.  Finally, the hypotheses were evaluated by assessing the sign and significance of the structural path coefficients, using one-tailed and two-tailed t-tests where appropriate.  SmartPLS does not calculate goodness-of-fit values; rather, R2 values were examined to assess the explanatory power of the proposed relationships for each construct, and t-values were assessed to determine the strength of the various paths.  Figure 3 below illustrates these results, and Table 21 summarizes the outcome of each hypothesis test.
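
The bootstrap step can be sketched generically (NumPy; estimate_path below is a hypothetical stand-in for the estimation SmartPLS performs internally, and the data are synthetic):

```python
# Generic bootstrap of a path coefficient's t-value, mirroring the
# 200-resamples-of-100 procedure described above.
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_t(data, estimate_path, n_samples=200, size=100):
    """Return the full-sample estimate and its bootstrap t-value."""
    original = estimate_path(data)
    resampled = [estimate_path(data[rng.integers(0, len(data), size=size)])
                 for _ in range(n_samples)]          # sample with replacement
    return original, original / np.std(resampled, ddof=1)

# Toy stand-in: treat the "path" as the correlation between two columns.
def estimate_path(d):
    return np.corrcoef(d[:, 0], d[:, 1])[0, 1]

demo = rng.normal(size=(73, 2))
demo[:, 1] += 0.4 * demo[:, 0]      # induce a positive "path"
print(bootstrap_t(demo, estimate_path))
```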

 

Figure 3. Structural Model with Standardized Path Coefficients

 

Hypothesis Testing

Table 21. Results of Hypothesis Testing

Hypothesis                                                                       Stdzd. Path Coeff.   t-Value   p       Result (α = .05)
H1a: Organizational Size will be positively associated with Deterrence.         -0.226               3.527     0.005   *
H1b: Organizational Size will be positively associated with Prevention.         -0.142               1.667     0.070   X
H1c: Organizational Size will be positively associated with Detection.          -0.106               2.553     0.019   *
H1d: Organizational Size will be positively associated with Remedy.             -0.308               4.285     0.002   *
H2a: Industry Affiliation will be related to Deterrence Efforts.                0.037                0.430     0.340   X
H2b: Industry Affiliation will be related to Prevention Efforts.                -0.284               3.064     0.009   √
H2c: Industry Affiliation will be related to Detection Efforts.                 -0.164               1.760     0.061   X
H2d: Industry Affiliation will be related to Remedy Efforts.                    -0.059               0.756     0.237   X
H3: Threats will be positively associated with Organizational Size.             0.244                4.010     0.003   √
H4: Threats will be related to Industry Affiliation.                            0.073                0.745     0.240   X
H5a: Deterrence will be positively associated with ISS Effectiveness.           0.322                3.460     0.005   √
H5b: Prevention will be positively associated with ISS Effectiveness.           0.643                8.622     0.000   √
H5c: Detection will be positively associated with ISS Effectiveness.            -0.099               0.990     0.178   X
H5d: Remedy will be positively associated with ISS Effectiveness.               0.119                2.185     0.033   √
H6a: Threats will be positively associated with Deterrence.                     0.488                7.023     0.000   √
H6b: Threats will be positively associated with Prevention.                     0.484                8.820     0.000   √
H6c: Threats will be positively associated with Detection.                      0.391                3.697     0.004   √
H6d: Threats will be positively associated with Remedy.                         0.660                14.150    0.000   √
H7a: Deterrence will be related to Threats.                                     0.052                0.368     0.714   X
H7b: Prevention will be related to Threats.                                     0.280                2.381     0.020   √
H7c: Detection will be related to Threats.                                      -0.065               -0.465    0.644   X
H7d: Remedy will be related to Threats.                                         0.484                4.135     0.000   √
H8: Organizational Size will be positively associated with ISS Effectiveness.   0.107                1.238     0.128   X
H9: Industry Affiliation will be related to ISS Effectiveness.                  0.160                2.481     0.021   √

Legend: √ = Supported; X = Not Supported; * = Significant in the opposite direction theorized.
