
12th ANNUAL SYMPOSIUM ON INFORMATION ASSURANCE (ASIA '17), JUNE 7-8, 2017, ALBANY, NY

ASIA '17 23

Expansive Learning in Cyber Defense: Transformation of Organizational Information

Security Culture

Shuyuan Mary Ho School of Information

Florida State University Tallahassee, FL 32306-2100

[email protected]

Alison von Eberstein School of Information

Florida State University Tallahassee, FL 32306-2100

[email protected]

Christy Chatmon School of Information

Florida State University Tallahassee, FL 32306-2100

[email protected]

Abstract—Cyber threats to an organization’s network infrastructure are most often the result of system vulnerabilities and misconfiguration. Systems engineers are constantly patching system holes and hardening networks, but their efforts are deemed futile in the eyes of hackers. This study adopts the lens of Activity Theory (AT) to examine the interactivity between Systems Engineers and Penetration Testers. The research aims to uncover conflicts that exist during interactions between cyber offense and defense. A simulation of learning-based conflict was conducted in the Cyber Security Virtual Laboratory on the Florida State University campus during Spring 2017. We argue that this expansive learning approach increases the effectiveness of protecting the organization. Concepts extrapolated from the collected data affirm that organizational information security culture can be transformed through the interactions of Penetration Testers and Systems Engineers. The study contributes insights on cyber defense effectiveness for the professional cybersecurity community.

Keywords—activity theory; cyber defense; organizational transformation

I. INTRODUCTION

The pervasiveness of computing technology has an increasing impact on our personal and business lives. Organizations are able to facilitate rich synchronous and asynchronous communication with their customers, partners, suppliers, and even government agencies by gathering and sharing information and data in cyberspace. On the one hand, these innovative connections foster the growth of business. On the other hand, information assets hosted and delivered by these technologies face tremendous threats. Malicious hackers continuously compromise the integrity and confidentiality of networked and online information. In Verizon’s global Data Breach Investigations Report [19], participating cybersecurity managers and practitioners revealed that the number of breaches targeting organizational data remains unchecked, and that attack methodologies are becoming more elaborate. Web application attacks became more severe in 2015, while intrusions on point-of-sale devices also increased significantly. However, denial-of-service attacks and threats from privileged users misusing their privileges both dropped significantly in 2015 (pp. 22-23).

Although cyber espionage may not occur as frequently, 90% of cyber espionage breaches are designed to capture trade secrets and proprietary information. Moreover, attackers have become markedly faster at compromising victims’ systems and networks. As the rate of these cyber attacks continues to escalate, the danger to organizations’ brand reputation and revenue presents tremendous challenges.

Systems engineers and IT professionals maintain system functionality by patching vulnerabilities and updating systems. They face significant obstacles in performing these duties due to the persistent presence of hackers (i.e., cyber attackers). Hackers, in general, may have a better grasp of systems knowledge, understand system vulnerabilities and backdoors, and can quickly perform loophole analysis. The growing number of successful cyberattacks suggests that many systems engineers either have not configured their systems properly or do not sufficiently understand the relevant security loopholes and vulnerabilities. Organizations need a clear understanding of secure configurations and countermeasures so that secure software and hardware can be deployed to prevent cyber attacks.

Willison and Warkentin [21] investigated violations of corporate security policy. They suggested that such violations are largely due to non-malicious noncompliance, poor employee awareness training, low motivation and commitment, or weak oversight from management; deterrence approaches have been recommended to reduce employee abuse. Siponen and Vance [16] offered guidelines for providing relevant contextual information when constructing field surveys on information security policy violations. D’Arcy, Hovav et al. [6] studied insider threat and its impact on system misuse through the lens of deterrence, suggesting that users’ perceptions of sanctions vary with individual levels of morality. Vance, Lowry et al. [18] provided an alternative approach to increase awareness and accountability and to reduce insider abuse and policy violations; these novel system mechanisms included identifiability, awareness of logging, awareness of audit, and electronic presence.

Despite these investigations into prevention- and detection-based approaches for reducing computer incidents and violations, information systems and networks, as the interacting objects between cyber attackers and defenders, remain vulnerable. Consequently, it is imperative to study and understand the critical relationships between penetration testers and defenders in order to improve technical problem-solving, incident handling, and systems protection.

Cyber infrastructure refers to a virtual “space” existing in the networks and systems where cyber defenders and attackers interact. In this space, there are roles for the cyber defenders (i.e., IT professionals or systems engineers) as well as roles for the attackers (i.e., white-hat hackers or penetration testers). Theoretically and operationally, both roles examine and assess system vulnerabilities through testing that emulates a real-life cyber attack. However, their motivations are different. Likewise, dissimilarities in the roles of cyber attackers and cyber defenders yield variations in goals and objectives.

The objectives of this study are to identify factors influencing the interactions and differences between cyber offense and defense, and to glean insights that will have practical implications for organizational information security best practices. To this end, we intend to investigate the following research question: How do cyber defense and offense teams interact in ways that can transform an organization’s information security?

We adopt the lens of Activity Theory (AT) [8] to study interactions and activities for capturing the differences in motivation and behavior between cyber defenders (i.e., Systems Engineers) and cyber attackers (i.e., Penetration Testers). More specifically, we analyze the sociological and technological factors that characterize this interactivity. This can be achieved by: (1) comparing the goals, actions and operations of both groups, and (2) studying how the social environment shapes the values espoused by each professional group in order to identify the conflicting values. This paper will first discuss the theoretical frameworks and extant literature, then describe the research design, methodology, and data collection. This article will conclude with research contributions.

II. THEORETICAL FRAMEWORK

With respect to Activity Theory (AT), the unit of analysis is ‘motivated activity directed at an object (goal).’ Activity pertains to subject-object interaction wherein the subject is considered an active entity (i.e., a cyber defender, or a cyber offender) whose motives transform the object [10]. Specifically, the object represents something that motivates the subject, or something that meets his or her needs. That is, an object is associated with a subject’s motive. Overall, AT embodies a concept in which activity is “a unit of subject-object interaction defined by the subject’s motive. It is a system of processes oriented toward the motive, where the meaning of any individual component of the system is determined by its role in attaining the motive” [10, p. 60].

The “triangle framework” defined by Engeström [9] has been widely used to study system and technologically mediated activities [10]. This framework consists of subject, object, instrument, rule, community, division of labor, and outcome. In an activity system, a subject refers to a social actor engaging in activities, and an object refers to the objective of the activity system. A community provides the social context for the subject (i.e., actor), which is an integral part of the activity system [7]. When a subject (e.g., a cyber defender) operates in and interacts with a community (e.g., a cyber defense team), the subject is aware of the rules (e.g., the team’s policies and social norms) and realizes the division of labor (i.e., the tasks assigned to each member) in the community. Using mediating tools (i.e., a system, software, or any technology), a subject will attempt to achieve certain objectives (e.g., to secure a system or a corporate network) prescribed by the community. Eventually, the outcome is that the entire activity system is transformed (e.g., systems become more secure and withstand cyberattack). AT is a suitable framework for discovering a subject’s motive by studying the subject’s actions, operations, and choice of tools within a social environment.
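To make the framework’s elements concrete, the sketch below encodes one activity system as a plain data structure and instantiates it for the cyber-defender example above. The class and all field values are our own illustrative mapping of Engeström’s terms, not part of his formulation:

```python
from dataclasses import dataclass

@dataclass
class ActivitySystem:
    """One activity system in the triangle framework (illustrative mapping)."""
    subject: str              # social actor engaging in the activity
    object: str               # objective that motivates the subject
    instruments: list         # mediating tools (systems, software, technology)
    rules: list               # team policies and social norms
    community: str            # social context in which the subject operates
    division_of_labor: dict   # tasks assigned to each member
    outcome: str = ""         # transformation of the whole activity system

# Hypothetical cyber-defender instance; names and tasks are invented.
defender = ActivitySystem(
    subject="cyber defender (Systems Engineer)",
    object="secure the corporate network",
    instruments=["firewall", "intrusion detection system"],
    rules=["team security policy", "change-control norms"],
    community="cyber defense team",
    division_of_labor={"engineer_1": "harden web server",
                       "engineer_2": "monitor event logs"},
    outcome="systems withstand cyberattack",
)
print(defender.subject, "->", defender.object)
```

A second instance with the attacker’s motive (e.g., “discover system vulnerabilities”) would model the Penetration Tester’s system in the same way, which is how the two-system interaction in Section II.B can be represented.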

A. Origins

Activity Theory (AT) originated during the late 1970s and early 1980s in the work of the Soviet developmental psychologist Leontiev [14]. AT was influenced by, and a response to, the pioneering work of Leontiev’s collaborator and professor, Vygotsky [20]. Through cultural-historical activity theory, Vygotsky [20] explained how humans reason by examining their activities while engaged in a social environment. The AT framework has been considered more a descriptive meta-theory than a predictive approach to providing insight into human activity. In essence, AT posits that in the context of a social environment, an object becomes a target of human activity that is mediated by artifacts or instruments [20]. For example, technology becomes a mediating tool between database administrators (subject) and database security protection (object). AT was adopted in Human-Computer Interaction (HCI) research by Bødker [2] to study the human use of technology, wherein technology plays a mediating role between a human and the world. In the 1980s, usability engineering became a popular concept in human-computer interaction. Usability engineering entails three key notions: 1) iterative development, to prototype and to align with user needs and experience; 2) contextual design, to broaden the scope of design through “low-tech” cooperative activities and “conversation analysis”; and 3) cost effectiveness, to carry out more prototyping, evaluation, and redesign [4].

In the early 1990s, more studies adopted AT in HCI research [1, 3, 11, 15]. Norman [15] differentiated between two views of an artifact. For example, from the system’s point of view, a checklist can enhance human memory and performance. From the individual’s personal perspective, however, a checklist enables human agency to change and plan the task, because a human can construct, read, interpret, and modify the to-do items based on the inputs to the list. Bannon [1] argued that man, as an intelligent being, can be likened to a general-purpose computer, and suggested that human factors should properly be factored into the design and development of a computer system. Carroll [4] elucidated the importance of usability, or “user-centered design,” of computer systems, and emphasized that the engineering scope of HCI would continue to broaden beyond mere user-interface interactions; a more integrated system that includes more interaction between human activities and technology will characterize the future of design.

Kuutti [12] suggested that as information-processing psychology increasingly influences HCI system design, human actions should be treated as the unit of analysis in the design process. An HCI design will include more cognitive relationships among a human actor’s activities, the artifacts of the computer, the tasks, and the context. A user’s motive will interact with his goal and condition of use, and the user’s activity will entail longer-term formations encompassing his actions and operations. The iterative design of HCI systems will involve groups (rather than individuals), a more practitioner-oriented workplace (rather than a laboratory), experts (rather than novices), and design (rather than analysis). Moreover, user-centered design will be replaced by user-involved design, with increased focus on iterative design rather than user-requirements specification.

AT conceives of and incorporates individuals as active entities within a system who maintain a relationship with society [13]. AT focuses on the interactions among subjects (i.e., actors) to identify salient concepts (e.g., to protect systems) and to suggest mechanisms of certain occurrences (e.g., explaining how and why actors behave in certain ways) [7, 10].

Researchers later began to incorporate perspectives of AT into information systems research. AT is relevant in IS research because “IS research should be able to deal with active individuals, societal change, and multi-disciplinarity” [13, p. 371]. For example, due to the lack of consistent data standards and structure in emergency management practice for fire-related events, Chen, Sharman et al. [5] proposed and validated a data model based on AT that informs emergency response system design. In this context, an IS study of fire emergencies used AT to design a data model that minimized information interoperability barriers in standardized procedures by studying the interrelationships among different entities [5].

B. Constructs

Engeström [8] offered an interesting conceptualization of an interactive space, which is also considered the unit of analysis, created by a minimum of two activity systems interacting to attain an overlapping object. This third generation of AT provides a practical lens through which to view the interactions and contradictions (e.g., tensions) within and between activity systems, the conflicts in activities, and the tools that facilitate the overall objectives shared between two or more communities. These interactive systems consist of dialogue, different perspectives, and networks of interacting activity that subsequently construct a shared object [8]. Engeström [8] devised a healthcare study in which patients and the healthcare system were conceived as two separate, interacting systems. The objective of patients was to heal (object), and the objective of the healthcare system was to provide healthcare services (object) [8]. During the interaction between these two systems, patients provided information about their illnesses in the context of recovery (object2) and the healthcare system categorized patients’ diseases to identify accurate diagnoses (object2). Finally, both patient and healthcare system contributed the information that produced a working diagnosis (object3, the shared object) [8], amidst a potential constellation of interacting problems and evolving diagnoses. In IS research, Valencha, Lee et al. [17] also conducted a healthcare study in which the third generation of AT was adopted to design an access control model (shared object) for transitive health workflows; the researchers examined the interactions among healthcare members, providers, purchasers, and payers.

Drawing from these studies, we posit that cyber defenders (i.e., Systems Engineers) and cyber attackers (i.e., Penetration Testers) are subjects in separate activity systems. The objective of the Systems Engineers (to assess and mitigate risk while enhancing system security) overlaps and further produces conflicts in values when confronted by the objective of the Penetration Testers (to discover system vulnerabilities and flaws using various hacking techniques). Although both subjects in their respective activity systems pursue conflicting goals, collectively they seek to attain a collaboratively constructed objective, which is to protect and improve the overall security posture of an organization.

Engeström [8] further proposed five principles for examining the motives, tools used, actions and operations, group dynamics within and between communities, rules and social norms within each community, and the division of labor. First, the unit of analysis is activity, which consists of actions and operations. Second, a division of labor generates different positions and multiple views among the subjects; both innovation and conflict result from this multi-voicedness. Third, every activity system has a history, and its past actions imply a pattern in decisions and actions. Fourth, there are structural tensions, driven by contradictions, in every activity system; thus, both conflict and innovation emerge from structural change. Fifth, as contradictions intensify, a collaborative, deliberate change will also evolve to produce positive transformation for the organization. Expansive cycles of change, initiated by the subject’s efforts to avoid contradiction and to initiate collaboration, will generate transformations in culture, as well as in organizational structure and configuration.

III. STUDY FRAMEWORK

AT is a suitable framework for this study mainly because the patterns of interaction between cyber attackers and cyber defenders can be revealed through the concept of activities. In other words, by examining activities that have occurred in a social environment, AT enables us to address our research objective: to investigate the interactive activities that explain the differences in job behaviors between Penetration Testers and Systems Engineers.


Fig. 1. Study framework.

TABLE 1. A SAMPLE OF INTERACTIONS BETWEEN CYBER DEFENDERS AND PENETRATION TESTERS TO PROTECT AND MAINTAIN THE SECURITY OF A WEB SERVER AND SUPPORTING NETWORK INFRASTRUCTURES.

The conceptual framework used to analyze the data collected from the interviews was based on the matrix for expansive learning [7] depicted in TABLE 1. Engeström’s [7] model portrays a combination of the five basic learning principles: activity systems as the unit of analysis, multi-voicedness, historicity of activity, contradictions, and expansive cycles. These principles concern four fundamental elements: (1) the subjects, (2) their motivation, (3) their objectives, and (4) the mechanisms of their learning. The model provides a lens for analyzing collaboration between activity systems (i.e., the cyber defender’s and offender’s teams) and the community (e.g., society), and for understanding the range of complex factors related to the use of mediating tools in achieving a collective goal (i.e., to protect and maintain secure networks and systems). The learning challenge in this setting is to transform and create a new culture in which cyber defense teams collaboratively work to improve the security posture of an organization.

IV. RESEARCH DESIGN

We replicated the servers, workstations, and network environment in four distinct segments, enabled by the virtual switch in the virtual lab and operated through Hyper-V management. Four (4) groups of Systems Engineers and Penetration Testers were paired up to configure their own systems and network environment. Each group had four to five members who worked separately to familiarize themselves with their systems, and then explored opportunities to penetrate the other teams’ servers and networks. First, the Systems Engineers set up their own Web servers, workstations, firewall (i.e., pfSense), intrusion detection systems (i.e., HoneyBot and Security Onion), and network environment. Then, each team set up its own penetration tools (i.e., Kali Linux) and worked covertly to penetrate other groups’ systems and networks. Teams performed reconnaissance (e.g., using Zenmap, Wireshark, and keyloggers, and reading event logs) to scan other groups’ networks and systems, and then made attempts at penetration.
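The port-scanning side of this reconnaissance can be sketched as a minimal TCP connect scanner. This is an illustrative stand-in for what Zenmap/Nmap automate, not the study’s actual tooling; the host and port list below are placeholders:

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Attempt a TCP connect() to each port; return those that accept."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Example: probe a few well-known service ports on a lab host.
print(scan_ports("127.0.0.1", [22, 80, 443]))
```

In the exercise, the same kind of scan serves both sides: defenders verify that only intended ports answer, while attackers look for exploitable ones.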

A. Laboratory Experiment

The research design involved a series of cybersecurity laboratory experiments, as illustrated in Fig. 1. The laboratory included Web servers, workstations, and networks that were configured and protected by participants acting in the role of Systems Engineers. Meanwhile, these systems were exploited by other teams acting in the role of Penetration Testers. Virtual machines created a “sandbox” environment that disconnected the Web servers from the Internet. Participants were required to log onto the “sandbox” to participate. This step ensured that the experiment was carried out in a separate and secure environment, so as not to affect the business operations of the University.

Fig. 2. Cyber exercise simulation.

V. METHODOLOGY

A triangulated mixed-methods approach is being employed progressively to generate a sociotechnical description of how teams interact, whether collaboratively, efficaciously, or not. This approach involves the simultaneous collection of quantitative data, in the form of electronic survey questionnaires, and qualitative data, in the form of two semi-structured interviews conducted with each participant. Moreover, each team’s network communication data (e.g., network packet information) is captured in the virtual simulation lab. These cybersecurity exercises span four months. In the current phase, we have employed a semi-structured interview instrument to collect data. Researcher observations of participants have been conducted to capture the complexities of both teams’ behaviors, as well as the meaning of their behaviors and interactions.


A. Data Collection

1) Subject selection: We recruited participants enrolled in an Advanced Information Security undergraduate course at Florida State University¹ from January through April 2017; all had acquired the prerequisite skills for the study in the Introduction to Information Security course offered in the prior Fall semester. Eighteen (18) student participants (seventeen males and one female) were grouped into four teams of four to five members each. Participants had sufficient technical skills and knowledge to yield meaningful results, and were well versed in several technical areas, e.g., troubleshooting, intrusion detection, vulnerability assessment, and application security. Some of the tasks participants performed included “protecting” their systems and networks, “fixing” possible vulnerabilities, “guessing” an attacker’s next move, and “thwarting” activities to “deter” possible future cyberattacks. Accordingly, their viewpoints on these activities shed light on cyber defense.

2) Interviews: Participants completed face-to-face, semi-structured interviews, first as a Systems Administrator and then as a Penetration Tester. The interviews were audio recorded to provide exact transcriptions. Each interview was conducted by the same member of the research team, in an office on campus, using a prepared script that accommodated open-ended responses. At the outset of each interview session, which lasted approximately 60 minutes, the interviewer reviewed the consent form with the participant, including the purpose of the study and the benefits and risks involved in participation. The interview solicited information about the division of labor; the context of the work and assignments; the tools used and their conditions of use; the activities involved in setting up the network/information system; and the activities involved in protecting and penetrating information assets. The interview also extracted the rules adopted in each of the teams as communities. Interviews were transcribed and coded using a prescribed study code book. A second member of the research team independently coded a subsample (five) of the 18 interviews. The two coders’ results were compared, and discrepancies were discussed to resolution.
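The double-coding step lends itself to a quantitative agreement check. The paper does not name an agreement statistic, so as one hedged possibility, percent agreement and Cohen’s kappa over two coders’ labels can be computed as below; the label set and values are invented for illustration:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders labeling the same items."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: fraction of items both coders labeled identically.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned to five interview passages.
a = ["rule", "tool", "rule", "labor", "tool"]
b = ["rule", "tool", "labor", "labor", "tool"]
print(round(cohens_kappa(a, b), 3))  # 0.706
```

Kappa above ~0.6 is conventionally read as substantial agreement, which would support the resolution-by-discussion step described above.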

B. Initial Results and Discussion

An initial round of interviews was conducted to capture the voice of a “defender.” This voice included defenders’ attitudes, perceptions, and use of rules and tools in making decisions, as well as their interactions with computers and with one another in achieving the goals of confidentiality, integrity, and availability of the information systems.

1) What are the subjects of learning?

Learning occurs dynamically amongst the interconnected, task-oriented work units: the cyber defenders’ teams (i.e., Systems Administrators) and the cyber attackers’ teams (i.e., Penetration Testers), as depicted in Fig. 2. During the interviews, it became apparent that each member in the activity system (i.e., the cyber defenders’ teams) had his or her own unique previous experiences (a historical view) with the various tools, rules, etc., as s/he performed the team activities, which introduced internal tensions and contradictions within the system. As Engeström [7] stated, instability and internal tensions are the “motive force of change and development.” The multi-voicedness of the interactions amongst defenders was reflected in the following excerpt: one interview Respondent mentioned that the leadership “…was half by necessity and half by self-appointment…” because of existing technology and systems experience and a frequent process of trial and error.

¹ Research protocol IRB-2016-19676 approved by the Florida State University Human Subjects Committee.

2) Why do they learn? The following interview response highlights the diverse activities, unifying factors, and tools that cyber defense teams deploy to achieve their tasks and goals. A Respondent stated that different tools were used. The defensive tools include: 1) Etherape, which scans network packets from all the machines connected to the local network, “…where you can see if someone is on it that isn’t supposed to be”; 2) Zenmap, which “…scans the system for open ports and vulnerabilities”; and 3) Komodo, pfSense firewalls, and Ubuntu IPtables, which allow rule-based configuration to manage port security by enabling or blocking certain packets from passing through the network. The offensive tools include the same scanners used for defensive purposes (i.e., Zenmap, Etherape), repurposed for network reconnaissance and for identification of exploitable ports.
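The open-port check at the core of the scanning tools mentioned above reduces to probing whether a TCP port accepts connections. A minimal sketch of that check follows (a toy illustration against the local host, not the actual Zenmap or Etherape tooling):

```python
import socket

def port_open(host, port, timeout=0.5):
    """Return True if a TCP connection to host:port succeeds, i.e., the port is open."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demo: bind a throwaway listener so there is a known-open port to find.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))   # port 0: the OS assigns a free port
listener.listen(1)
port = listener.getsockname()[1]
print(port_open("127.0.0.1", port))   # the listener accepts, so this prints True
listener.close()
```

A defender sweeping such a check across a host's ports sees what is exposed and should be closed; an attacker running the identical sweep sees what is exposed and can be probed, which is exactly the dual use the Respondent describes.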

Within each activity system (i.e., the team), as well as the larger system (i.e., the class), a learner’s historical perspective and previous experiences can also influence the learning process and have an impact on the learner’s motivation to learn. This influence is evident in the following participant’s response when asked to describe his previous cyber defense experiences. The Respondent replied, “Yes, I’d participated in a few CTF (capture the flag) events.” Participants’ past experiences help them troubleshoot other teams’ system vulnerabilities.

3) What do they learn? The participants in the study learn from many resources in their community. As participants sought solutions to the cyber security exercises, they often engaged their instructor or peers, asking questions to clarify understanding or to fill knowledge gaps about tools, rules, and procedures. Engeström’s [8] notion of “knotworking” was also reflected in several of the participants’ responses. Knotworking is a new form of collaboration that allows the members of a team not only to coordinate their tasks and communicate about what is to be done, but also to devise solutions for emerging problems. A Respondent who participated in the Collegiate Cyber Defense Competition (CCDC), a nation-wide competition, described the goal of the simulated assignment as protection of a voting or banking system. The team was assigned the task of properly configuring and managing systems with the objective of ensuring that services remained available at all times. The team also needed to respond to business injects and operational requests, such as creating user accounts or setting up a new Web server. The Respondent stated that “A new challenge required…[the team]…to organize the chain of custody [and that all activities were] logged and properly documented according to the policies.”

4) How do they learn? An examination of the interview responses reveals how the participants learn and acquire the skills needed to secure their information assets. The learning process is not the same for every participant in the activity system. As one Respondent stated, “[The] first resource is to straight-up trial and error. And, the next thing is, we help each other.” Members of a team assist one another in identifying solutions to defend or attack vulnerable systems. The third commonly cited resource was the campus Cybersecurity Club community, whose members meet on a weekly basis. A Respondent described the Club as a kind of workshop to write “…Python scripts to break an encryption…[or] to conduct a brute-force attack. We also learned as a community to enable HTTPS on a Web server so that more HTTP traffic can be more secure. The only ‘experts’ that I know are my instructor and the graduate students in the Cybersecurity Club.” The teams also use online communities for advice on programming, scripting, and best practices.
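The brute-force exercise the Respondent describes can be illustrated with a toy sketch. The 4-digit PIN and the SHA-256 target below are invented for illustration; real systems use salted, deliberately slow password hashes precisely to make this kind of exhaustive search impractical:

```python
import hashlib
from itertools import product

def brute_force_pin(target_hash, length=4):
    """Try every numeric PIN of the given length until one hashes to the target."""
    for digits in product("0123456789", repeat=length):
        candidate = "".join(digits)
        if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
            return candidate
    return None  # exhausted the search space without a match

# Demo: hash a known PIN, then recover it by exhaustive search.
target = hashlib.sha256(b"4071").hexdigest()
print(brute_force_pin(target))  # → 4071
```

A 4-digit numeric space is only 10,000 candidates, which a modern machine exhausts in milliseconds; the pedagogical point of the Club exercise is that search time grows exponentially with key or password length.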

C. Summary

Initial analysis of our ongoing study suggests that AT is a suitable framework for examining the interactions between Systems Engineers and Penetration Testers. As these subjects interact and learn, the organization also benefits through an evolving culture of security awareness. This evolution creates an opportunity for expansive learning, as contradictions or tensions lead to change, expansion, and development in both activity systems: those of the Systems Engineers and of the Penetration Testers.

VI. CONCLUSION AND FUTURE WORK

Use of AT to examine the security culture of organizational cyberinfrastructure is innovative and, based on our preliminary findings, effective. By gleaning insight into the interactions between participants in both roles, the subsequent phases of this study will continue to illuminate the differences in goals and operations between cyber offense and defense. Future research will include additional data collection, elaboration of the qualitative coding scheme, transcription analysis, and examination of conflicts among different activity systems (i.e., teams). The effectiveness of using this methodological approach to protect the organization’s information assets will also be assessed. The study will contribute cyber defense insights to the professional cybersecurity community.

ACKNOWLEDGMENT

The first author acknowledges the previous work, discussions, and presentations with Dr. Hwee-Joo Kam at the 2016 IFIP WG8.11/WG11.13 Information Systems Security Research Workshop and the 2016 ICIS JAIS Theory Development Workshop in Dublin, Ireland. The authors also acknowledge the research participants from LIS4777 at Florida State University during Spring 2017.

REFERENCES

[1] Bannon, L. From human factors to human actors: The role of psychology and human-computer interaction studies in system design, in Design at Work: Cooperative Design of Computer Systems, Greenbaum, J. and M. Kyng. 1992. Lawrence Erlbaum Associates Inc.: Hillsdale, NJ. 25-44.

[2] Bødker, S. A human activity approach to user interfaces. Journal of Human-Computer Interaction, 1989. 4(3): 171-195. doi:10.1207/s15327051hci0403_1.

[3] Carroll, J.M., Designing Interaction: Psychology at the Human-Computer Interface. 1991. New York: Cambridge University Press.

[4] Carroll, J.M. Human-computer interaction: Psychology as a science of design. Annual Review of Psychology, 1997. 48: 61-83.

[5] Chen, R., R. Sharman, H.R. Rao, and S.J. Upadhyaya. Data model development for fire related extreme events: An activity theory approach. MIS Quarterly, 2013. 37(1): 125-147.

[6] D'Arcy, J., A. Hovav, and D. Galletta. User awareness of security countermeasures and its impact on information systems misuse: A deterrence approach. Information Systems Research, 2009. 20(1): 79-98. doi:10.1287/isre.1070.0160.

[7] Engeström, Y. Innovative learning in work teams: Analyzing cycles of knowledge creation in practice, in Perspectives on Activity Theory, Engeström, Y., R. Miettinen, and R.-L. Punamäki-Gitai. 1999. Cambridge University Press: Cambridge, UK. 377.

[8] Engeström, Y. Expansive learning at work: Toward an activity theoretical reconceptualization. Journal of Education and Work, 2001. 14(1): 133-156. doi:10.1080/13639080020028747.

[9] Engeström, Y., Learning by expanding: An activity-theoretical approach to developmental research, 2nd ed. 1987. New York: Cambridge University Press. 338.

[10] Kaptelinin, V. and B. Nardi, Acting with technology: Activity theory and interaction design. 2006. Cambridge, MA: MIT Press.

[11] Kuutti, K. Activity theory and its applications to information systems research and development, in Information systems research: Contemporary approaches and emergent traditions, Nissen, H.E., H.K. Klein, and R. Hirschheim. 1991. Elsevier North-Holland, Inc.: Amsterdam, The Netherlands. 529-549.

[12] Kuutti, K. Activity theory as a potential framework for human-computer interaction research, in Context and Consciousness: Activity theory and human-computer interaction, Nardi, B.A. 1996. MIT Press: Cambridge, MA.

[13] Kuutti, K. Activity theory, transformation of work, and information systems design, in Perspectives on Activity Theory, Engeström, Y., R. Miettinen, and R.-L. Punamäki-Gitai. 1999. Cambridge University Press: Cambridge, UK. 1-360.

[14] Leontiev, A. Retsenzija na knigu: Basov M. Ya. Obschie Osnovy Pedologii [Book review: General Foundations of Pedology by M. Ya. Basov], in Estestvoznanie i Marxism, Basov, M.Y. 1929: Russia. 211-213 (In Russian).

[15] Norman, D.A. Cognitive artifacts, in Designing interaction, Carroll, J.M. 1991. Cambridge University Press: Cambridge, MA. 17-38.

[16] Siponen, M. and A. Vance. Guidelines for improving the contextual relevance of field surveys: The case of information security policy violations. European Journal of Information Systems, 2013. 23(3): 289-305. doi:10.1057/ejis.2012.59.

[17] Valencha, R., J. Lee, and H.R. Rao. Privacy issues in healthcare, 2014. The Networking and Information Technology Research and Development (NITRD) Program.

[18] Vance, A., P.B. Lowry, and D. Eggett. Using accountability to reduce access policy violations in information systems. Journal of Management Information Systems, 2014. 29(4): 263-290. doi:10.2753/MIS0742-1222290410.

[19] Verizon. 2016 Data breach investigations report, 2016. Verizon. 1-85.

[20] Vygotsky, L.S. Interaction between learning and development, in Mind in Society, Gauvain and Cole. 1978. Harvard University Press: Cambridge, MA. 79-91.

[21] Willison, R. and M. Warkentin. Beyond deterrence: An expanded view of employee computer abuse. MIS Quarterly, 2013. 37(1): 1-20.
