What is the Cyber Kill Chain

Throughout history, humankind has worked to improve the quality of life. A brief history of the 20th century reveals countless inventions, from automobiles to airplanes, vacuums to microwave ovens, and contact lenses to Viagra.  Many things we use every day were once a dream in an inventor's eye, but the invention of the computer has taken us further than any dream could ever hope.

Today, the computer is everywhere.  Computers are the tools used in banks and businesses, by engineers, scientists, and educators, as well as by millions of people around the world.  Computers can accomplish many tasks with extreme accuracy and speed.  We can gain a great deal of information using the computer, and we can store a huge amount of data on it.  We could not imagine a world without the computer, but no great invention has ever come about without an element of risk.

The history of computer hacking dates back to the onset of computing.  A computer hacker is one who develops, changes, or attempts to circumvent computer security hardware and software.  People hack computers for both positive and negative (criminal) reasons.  Criminal hackers develop malware or spyware to gain access to confidential information.  This type of exploration may have started as a game, but it has progressed rapidly, and dangerously, as our reliance on computers has grown.

As the business of hacking has become more sophisticated, so have the defensive techniques for detecting and neutralizing computer threats.  A new class of threat, the “Advanced Persistent Threat” (APT), targets highly sensitive economic, proprietary, and national security computer networks.

Lockheed Martin is a global security company specializing in the protection of some of the most sensitive information systems in the world.  Lockheed Martin believes it is possible to understand, anticipate, and even lessen the damage of an attack based upon knowledge of the threat.  The term “Cyber Kill Chain” describes the different stages of a cyber-attack as a process.

Each stage of the chain completes a specific step along the path to attacking a given system; stages may occur in parallel, or the order of earlier stages may be switched. The main strength of the kill chain model is that it shows how far an attacker progressed, how much damage was done, and what kind of forensic investigation must be performed as a result. For each attack, the system administrator can ask two questions: “Was this a successful breach?” and “Did the attackers reach their intended goal?” A typical attack is shaped by how much the attacker knows about how the target system is structured and how it operates, so the response should be based on the same structure and process an attacker might use. This allows an IT department to develop a results-oriented set of security procedures to prevent attacks against the system. The model does have a weakness: it focuses on perceived weaknesses in the system without any proof that the target is of value to the attacker. Using the cyber kill chain and understanding the signature of an APT can help harden defensive capabilities; this includes security controls and actions that can be implemented or improved to detect, deny, and contain an attack scenario.
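The stages of the Lockheed Martin model are usually listed as seven steps, from reconnaissance through actions on objectives. A minimal sketch of the idea described above: given the last stage an administrator observed, report how far the attacker progressed. The stage names come from the published model; the paired defensive actions here are illustrative assumptions, not an official matrix.

```python
# The seven Cyber Kill Chain stages (Lockheed Martin model), each paired
# with an example defensive action. The action mappings are illustrative
# assumptions for this sketch.
KILL_CHAIN = [
    ("Reconnaissance",        "detect: web log analytics"),
    ("Weaponization",         "deny: patching, attachment analysis"),
    ("Delivery",              "deny: mail filtering, proxy blocking"),
    ("Exploitation",          "disrupt: endpoint protection"),
    ("Installation",          "contain: application allow-listing"),
    ("Command and Control",   "contain: egress firewall rules, DNS sinkhole"),
    ("Actions on Objectives", "detect: data-loss prevention, audit logs"),
]

def stages_reached(last_observed_stage):
    """Return every stage name up to and including the last one observed,
    showing how far the attacker progressed along the chain."""
    names = [name for name, _ in KILL_CHAIN]
    return names[: names.index(last_observed_stage) + 1]
```

For example, `stages_reached("Delivery")` returns the first three stage names, telling the investigator that exploitation was never confirmed.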






Open Source Wireless Protocol Analyzer

A packet analyzer is a computer program that can intercept and log data traffic passing through a network. As data streams flow across a network, a packet analyzer captures each packet and decodes its raw data. Of the many open source network protocol analyzers, three of the most popular applications are Wireshark, Capsa, and Packetyzer.


Wireshark is a very popular free and open-source network analyzer, and it is cross-platform. What makes this application so popular is how easy it is for anyone to view the network traffic visible on a given network interface. It is similar to tcpdump, but with a graphical front end instead of a command-line interface, plus sorting and other advanced filtering options that let the user examine data in more depth, whether from a live network or from saved packet captures. What Wireshark cannot do is act as an intrusion detection system or manipulate packets; it is limited to examining them.

Wireshark uses the pcap application programming interface (API) to capture packets, which comes from the libpcap code library for the C programming language on UNIX-based systems, and from WinPcap on Windows-based machines. Libpcap was first developed at Lawrence Berkeley Laboratory for use with tcpdump for low-level packet capturing.
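To make “decoding the packet’s raw data” concrete: a captured frame is just a sequence of bytes, and the analyzer parses each layered header out of it. A minimal sketch, decoding the fixed 14-byte Ethernet II header with Python’s struct module; the sample frame bytes are made up for illustration:

```python
import struct

def decode_ethernet(frame: bytes):
    """Decode the 14-byte Ethernet II header: destination MAC,
    source MAC, and EtherType (0x0800 = IPv4)."""
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
    fmt = lambda mac: ":".join(f"{b:02x}" for b in mac)
    return {
        "dst": fmt(dst),
        "src": fmt(src),
        "type": hex(ethertype),
        "payload": frame[14:],  # rest of the frame, e.g. an IP packet
    }

# A made-up frame: broadcast destination, arbitrary source, EtherType IPv4,
# followed by the first two bytes of an IPv4 header.
sample = bytes.fromhex("ffffffffffff" "00115566aabb" "0800") + b"\x45\x00"
hdr = decode_ethernet(sample)
```

A real analyzer like Wireshark repeats this kind of parsing for every protocol layer (IP, TCP, HTTP, and so on) using dissectors, but the underlying operation is the same header-by-header decoding shown here.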

Another popular analyzer is Capsa, which comes in three different versions ranging in price from free to $995. This application does everything Wireshark does, including real-time packet capture and constant network monitoring. Where it surpasses Wireshark is its advanced graphical interface, which provides a clearer view of the network and makes packet-level analysis and other network troubleshooting easier. The drawback for network administrators is the cost: $995 for the Enterprise edition and $695 for the Professional edition, which can be hard on a starting IT budget, though the free version is a good way to test the product before purchasing it. The free version has many of the advanced features removed, can only monitor ten IP addresses at once, and has a downtime of four hours before it can be used again.

The last analyzer researched was Packetyzer, a very basic packet sniffer based on the Ethereal project that provides a GUI for Windows machines. It works much like Wireshark but does not have as nice a graphical interface for packet capture.

Of the three applications I researched, I would recommend that any IT professional use Wireshark as a starting point for network troubleshooting, because it is easy for anyone to use and can be used on any system for free. My second choice would be Capsa: when a network administrator encounters a more advanced network problem, investing in Capsa may be worthwhile if the problem requires a more in-depth examination to find a solution.

Mobile Device Security in the Workplace

Access to data is no longer limited to the fixed computer workstation.  Laptops, smartphones, and tablets give us access to files, pictures, and music from anywhere in the world.  This is especially attractive in the workplace, where mobility allows employees to check email, access applications in the cloud, or review office documents.  Unfortunately, the idea of “bring your own device” (BYOD) to work is creating privacy and security issues, prompting questions about how much access anyone should have to a company’s network or cloud.

News of internal leaks of office documents is all over the Internet, raising concerns about how to prevent confidential data from falling into hackers’ or competitors’ hands.  There have been attempts in the IT world to address this growing mobile device risk, but separating company devices from employee devices has proven costly and very difficult to implement. Some companies buy mobile devices for employees, yet they still lose the ability to cut costs, even when buying in bulk.  The company ends up paying for calls and data plans as employees claim these costs as work expenses.  In addition, implementing new network security measures to cope with the increase of new devices on the network is very costly.  The company’s IT department must spend more money and other resources on mobile data protection, network access control, and device management.

One solution is to implement a Virtual Mobile Infrastructure (VMI), where a user can access virtual mobile operating systems running on the company’s server without putting company data at risk.  Employees have access to two operating systems on their mobile device: one dedicated to the company server and the other for personal Internet access.  For example, an IT administrator can run one or more virtual machines with Android applications in a data center and deliver the application data to any location.

Headjacks and Neural-interfaces

A headjack is a small data port created on synthetically grown humans who become connected to the neural-interactive virtual reality known as the Matrix. The process of “jacking in” involves connecting the headjack to the communication network of a hovercraft so as to materialize within the Matrix or the Construct as an avatar.


The concept comes from the movie “The Matrix” (1999), a science fiction film written and directed by The Wachowski Brothers. It depicts a dystopian future where the reality perceived by humans is actually a simulated reality called “the Matrix,” created by sentient machines in order to subdue the human population.

Headjacks are located at the base of the skull, just above where the neck meets the back of the head. The jack provides room within the otherwise close quarters of the human cranium for a data probe. Communication through the headjack is a mixture of electrical and electromagnetic transmissions. While the headjack is the central interface, thousands of tiny electrodes meld into and throughout the brain, or weave throughout the central nervous system at the brain stem.

When an individual with a headjack is inserted into the Matrix or another virtual location, their body is vulnerable to physical harm and tampering. Notably, if the headjack is removed by someone in the real world before the individual can “jack out” of the Matrix, they suffer instantaneous death from an unknown cause, presumably massive neural damage.

In real life, researchers at MIT have developed an interface that could allow a computer to plug directly into the brain. The new fibers are less than the width of a hair. The researchers claim such a system could deliver optical signals and drugs directly into the brain, along with electrical readouts to continuously monitor the effects of the various inputs. While a single preform a few inches long can produce hundreds of feet of fiber, the materials must be carefully selected so they all soften at the same temperature. The fibers could ultimately be used for precision mapping of the responses of different regions of the brain or spinal cord.



  • The new fibers are made of polymers that closely resemble the characteristics of neural tissues.
  • Multifunction fibers deliver optical signals and drugs directly into the brain, along with electrical readouts to continuously monitor the effects of the various inputs.
  • Combining the different channels could enable precision mapping of neural activity and, ultimately, treatment of neurological disorders that would not be possible with single-function neural probes.







Computer Worms

A computer worm is a standalone malware program that replicates itself in order to spread to other computers or networks via email, instant messaging, IRC chat, and peer-to-peer network connections. (The acronym WORM, “Write Once Read Many,” refers to storage media and is unrelated to this kind of malware.) Some of the more modern trends in worm mitigation are packet filters, ACLs in routers and switches, and null routing.

One of the first computer worm attacks was sent over the early Internet, infecting nearly 10% of the UNIX computers belonging to NASA, Berkeley, MIT, Stanford, and the Pentagon. Released on November 2, 1988, it was called the “Morris Worm,” after its designer Robert Morris. While studying at Cornell University and experimenting with self-propagating programs, he chose to release the worm from MIT to disguise the fact that it was created at Cornell. Once Morris realized the extent of the damage the worm was doing to the Internet, he contacted a friend at Harvard to discuss how to stop it. The worm took advantage of a hole in the debug mode of the UNIX sendmail program to propagate through the network.

The main difference between a virus and a worm is that worms do not need to attach themselves to an existing program to infiltrate a system, whereas viruses attach themselves to files and require user interaction to infect the computer. Worms use networks to travel from one computer to another without any user interaction. Worms can be programmed with a payload, code added to the worm that does more than just spread it, which can do any of the following:

  • Delete files
  • Encrypt files (Cryptoviral extortion)
  • Send Documents in an email
  • Install a backdoor on a computer

When a worm installs a backdoor on a computer, the machine becomes a “zombie”: the compromise allows it to be used remotely to perform any type of malicious task.
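Because worms spread without user interaction, their propagation can be modeled very simply: each round, every infected host infects the neighbors it can reach over the network. A toy sketch of that model; the network topology here is invented purely for illustration:

```python
# Toy model of worm propagation: each round, every infected host infects
# all of its directly reachable neighbors. No real scanning or exploit
# code is involved; this only illustrates the spread dynamics.
def simulate_spread(network, patient_zero, rounds):
    """Return the set of hosts infected after the given number of rounds."""
    infected = {patient_zero}
    for _ in range(rounds):
        newly = {neighbor
                 for host in infected
                 for neighbor in network.get(host, [])
                 if neighbor not in infected}
        if not newly:  # nothing left to reach; the worm has saturated
            break
        infected |= newly
    return infected

# Made-up topology: which hosts each host can reach directly.
network = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": [],
    "E": ["F"],
    "F": [],
}
```

Starting from host "A", one round infects B and C; two rounds add D and E. This is also why the mitigations mentioned earlier (packet filters, ACLs, null routing) work: they remove edges from the graph, which directly limits how far each round can spread.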

Not all worms are bad. There has been a lot of research over the years into designing “good intention” worms that can be used as network diagnostic programs, and studying how worms spread has informed the creation of non-malicious ones. John Shoch and Jon Hupp of Xerox researched and designed a worm to allow testing of Ethernet principles on their internal computer networks. Another example was the Nachi worm, which exploited a vulnerability in the Microsoft Remote Procedure Call (RPC) service to search a system for installed malware and then tried to install a security patch from Microsoft to prevent any further infection.

Programmers and network penetration testers can use worms for either good or malicious purposes. No one can protect themselves from every type of malware or network worm, but with basic computer knowledge and anti-malware software installed, any user can defend against most types of malware attacks.







Free Online Operating System Tutorials

A few free online tutorials explain what operating systems are and what they can do.

Tutorials Point

This site had 15 sections about what an operating system is and all its different parts, including a section on Linux. The target audience of the tutorial is computer science graduates, helping them understand basic to advanced concepts related to operating systems. One thing I liked best about this website was a section of OS exam questions with answers to help with the learning process.


W3schools

W3schools is an organization of teams of professional experts in various fields of design and software application development. This operating system tutorial has 16 sections covering the main concepts of what an OS is, plus one final section that briefly describes Linux and what a kernel is. This tutorial was on the short side and just scratched the surface of the basic concepts of what an operating system is and what it can do.


Studytonight

Studytonight provides free and easy education on the Internet, with the goal of bringing a student's entire study routine online. This course had 3 modules: introduction, process & multithreading, and memory management. It also had topical tests and a Q&A forum to help students with the course and with learning Linux.