The Cloud and Remote Agent-Based Code Auditing and Reporting Framework

'The Cloud and Remote Agent-Based Code Auditing and Reporting Framework' is the name appointed for the global audience, while the code name 'Project Mahabarana' is the name appointed within Sri Lanka. This project is dedicated primarily to critical organizations such as the military, intelligence services and police; it is of limited use to the general public or the private sector, although it may occasionally be useful in the global business domain. The key outcome of this project is revealing backdoors and other illegal back-channels, such as rootkits or any other form of malware hidden inside original software applications through shrink-wrap code hacking techniques, by analysing and auditing the disassembly code of the original binary file or files. In the contemporary epoch, anti-privacy, anti-security, espionage and illegal authoritative activities are popping up all over the world. Major news reports and whistle-blowing have revealed that billion-dollar multinational and blue-chip software companies have given top-secret backdoor access to intelligence organizations such as the NSA, CIA and FBI, truly violating customers' privacy and intellectual property. This project challenges such unauthorized or state-authorized spying, espionage and other illegal activities: before an organization purchases such expensive software applications, we conduct a full legal code (disassembly code) audit of the entire binary application and try to find out whether the specific software is backdoored or safe to deploy in the organization. The entire project is based on the principle that 'prevention is better than cure'.


NSA - National Security Agency
FBI - Federal Bureau of Investigation
CIA - Central Intelligence Agency
EULA - End-User Licence Agreement
DMCA - Digital Millennium Copyright Act
SSB - Software Setup Binary
NIST - National Institute of Standards and Technology
CERT - Computer Emergency Readiness Team
PC - Personal Computer
PE - Portable Executable
UPX - Ultimate Packer for eXecutables
CPU - Central Processing Unit
DoS - Denial-of-Service
DDoS - Distributed Denial-of-Service
TCP - Transmission Control Protocol
UDP - User Datagram Protocol
C&C - Command-and-Control Server
IRC - Internet Relay Chat
IT - Information Technology
VoIP - Voice over IP
MUP - Manual Unpacking
FUD - Fully Undetectable
USB - Universal Serial Bus
CD - Compact Disc
DLL - Dynamic Link Library
TLS - Thread-Local Storage
SYS - System File (Device Driver)
EFS - Encrypting File System
GUI - Graphical User Interface
LZMA - Lempel-Ziv-Markov chain Algorithm
AES - Advanced Encryption Standard
SQL - Structured Query Language
SEO - Search Engine Optimization
FON - Font File
TLS - Transport Layer Security
IANA - Internet Assigned Numbers Authority
IRCd - Internet Relay Chat Daemon
APT - Advanced Persistent Threat
VM - Virtual Machine
HTML - Hypertext Markup Language
CSS - Cascading Style Sheets
JS - JavaScript
VNC - Virtual Network Computing
SSL - Secure Sockets Layer
SSH - Secure Shell
LAN - Local Area Network
FDR - Flight Data Recorder
CVR - Cockpit Voice Recorder

1.1. Overview of the Project
Project Mahabarana (The Cloud and Remote Agent-Based Code Auditing and Reporting Framework) is an exclusively copyrighted, innovative conceptual software framework from the beloved motherland, Sri Lanka. In this project we will mainly discuss the major issues involved in the areas of Information Technology and Information Security. Many companies and public/private organizations use commercial software applications, paying enormous sums to global software vendors. Most of the companies willing to buy these software applications do not consider their privacy and confidentiality much, owing to their great and utmost trust in those global software vendors (e.g. Microsoft, Symantec).

As a rock-solid example of the aforementioned serious violations and illegal activities in the business arena, presume an organization is going to purchase genuine Microsoft Office Suite licences from Microsoft at enormous cost. Even though the customer is spending such a vast amount of money, Microsoft severely violates every privacy right of that customer by providing invisible top-secret backdoor access (silently leaking most of the customer's utmost sensitive, private and confidential data) to powerful intelligence establishments such as the NSA and FBI, while binding the customer or end-user to a strictly worded legal EULA. This is an utterly unethical proceeding in the civilized commercial business domain, because it builds an illegal line of attack by framing end-users and clients within a strict legal (intellectual property) framework under the governing laws of the world's states.
To install or deploy any Microsoft or non-Microsoft software application, the end-user must first 'Agree' to the private EULA stated or embedded in the software installation (software setup binaries) itself. The EULA strictly prohibits the end-user from any 'modification, tampering and/or reverse engineering, including decompiling, disassembling and/or any other in-place activities' on the vendor's private binaries (files or binary resources), and the EULA is mostly established on the DMCA (Digital Millennium Copyright Act) and supplementary mature intellectual property laws in the business world. Violators of such EULAs will be prosecuted legally to the maximum possible extent (including huge financial claims as legal penalties). In this manner, Microsoft and the other governing authorities of the commercial software domain strictly violate every business right, information right, domestic right and human/personal privacy right. Therefore I firmly challenge them with the subsequent questions:

a) Who will prosecute them for their violations of business ethics and rights?
b) Who will legally charge them for the aforesaid rights violations?
c) Who will pay for the customers' damages caused by these illegal movements?
d) What successful and positive resolutions can evade or diminish these types of illegal movements by commercial software vendors?

My entire project dissertation is constructed on this serious series of questions: to precisely attack these unjust activities and to accomplish reasonable, unbiased justice for the parties who become victims of these illegal activities. My project mainly helps to reach near-certain (99.9%) answers to the aforesaid questions.
1.2. Motivation of the Project
In the past, nobody ever introduced a theory, software tool/utility or conceptual framework for these issues of illegal spying and espionage carried out by parties belonging to the powerful states. I understood the gravity of these issues and devised this novel conceptual framework to diminish such unethical proceedings and approaches. I hope my solid effort will help the thousands of innocent victims trapped inside these illegal activities conducted by global authoritative top-secret organizations. Initially the framework is still in its childhood; over the next couple of years it will show rapid growth with the support of future academic researchers all over the world.

1.3. Project Intentions
Basically, the key objective of this project is to develop a cloud and remote agent-based code auditing and reporting framework, and through it to educate the thousands of innocent victims trapped inside these illegal activities conducted by global authoritative top-secret organizations, by delivering the appropriate knowledge to their management and technical staff so as to diminish, prevent and eliminate such unethical situations in the business domain.
2.1. Literature Review and the Purpose
A literature review is an evaluative report of readings found in the literature associated with the designated subject area or area of expertise. The purposes of the literature review are to deliver a context for the research; to show that the work contributes to the understanding and knowledge of the field; to help enhance, refocus or even amend the topic; to substantiate the research; to indicate where the research fits into the prevailing body of knowledge; to highlight flaws in former research; to enable the researcher to learn from earlier work on the specific subjects; to make sure the research has not been done before; to demonstrate how the topic has been studied up to that time; and to outline gaps in former research. It goes beyond the mere exploration for information and comprises the identification and articulation of relationships between the literature and the field of research.

2.2. The Cloud Computing in a Nutshell
NIST (the National Institute of Standards and Technology), in (Mell and Grance, 2011), defined cloud computing as 'a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction'. US-CERT supported this statement (Alex and James, 2011): 'Cloud computing is a subscription-based service where you can obtain networked storage space and computer resources'. My own view of cloud computing matches the US-CERT and NIST definitions. As an example, Google Drive applies the cloud computing concept in a successful manner. In the extensive explanation of Google Docs on the Google Drive framework, a document can be created in Google Docs using the virtually hosted cloud app (a word processor similar to Microsoft Word), and after creation the document can be easily and directly uploaded to Google Drive for sharing with others. There you can set the access permission level of the document to public, private groups, etc. The permitted audience can then easily access the saved document from their mobile devices (e.g. PDAs, tablets and smartphones) or from a PC, notebook, laptop or netbook.

2.3. PE-Packing/Unpacking and PE-Crypting/Decrypting
PE is the acronym for Portable Executable, which can be identified as the native executable file format of the Microsoft Windows operating system. On this point, (Berg, 2011): 'Packers are programs designed to compress (pack), and sometimes encrypt (crypt), the contents of executable files. These obfuscation techniques (crypting) work by compressing (packing) the executable and obfuscating (crypting) its contents, which results in a new executable. Before the executable is loaded into memory, its contents pass through a decompression (unpacking) and de-obfuscation (decrypting) routine that extracts the program into memory.' There is another supportive view, (Khalil, 2014): 'A packer is proposed to reduce file size at first. A packed executable file is a file to which a packer has been applied; this packed executable file operates functionally the same as the original file.' As a classic example, (UPX, 2013): 'UPX achieves an excellent compression ratio and offers very fast decompression. Your executables suffer no memory overhead or other drawbacks for most of the formats supported, because of in-place decompression.' Most of the time these PE packers and PE cryptors are used by malware authors (virus writers) to evade detection by anti-malware products.
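The detection of a packed binary, as described above, can be sketched with two naive heuristics: scanning for well-known UPX marker strings in the raw bytes, and measuring byte entropy (packed or encrypted content tends toward the 8-bit maximum). This is only an illustrative sketch, not a substitute for a real unpacker or anti-malware engine.

```python
import math

# Naive heuristic: stock UPX builds normally leave the section names
# "UPX0"/"UPX1" and the "UPX!" magic visible in the packed file.
UPX_MARKERS = (b"UPX0", b"UPX1", b"UPX!")

def looks_upx_packed(data: bytes) -> bool:
    """Return True if any well-known UPX marker appears in the raw bytes."""
    return any(marker in data for marker in UPX_MARKERS)

def shannon_entropy(data: bytes) -> float:
    """Bytewise Shannon entropy in bits; packed/crypted data approaches 8.0."""
    if not data:
        return 0.0
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)
```

In practice one would read the candidate file with `open(path, "rb").read()` and flag it for deeper sheep-dip analysis when a marker is found or the entropy exceeds roughly 7.0.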

2.4. Sandboxing
In the definition of (Geier, 2014): 'Sandboxing is a form of software virtualization that lets programs and processes run in an isolated virtual environment. Typically, programs running within the sandbox have limited access to your files and system, and they can make no permanent changes. That means that whatever happens in the sandbox stays in the sandbox.' As a classic example, (Sandboxie, 2012): 'Sandboxie is a product which runs your programs in an isolated space, which prevents them from making permanent changes to other programs and data in your computer.'

2.5. Reverse Engineering
(Eilam, 2005) defines reverse engineering as 'the process of extracting the knowledge or design blueprints from anything man-made'. A supporting explanation by (Janssen, 2014): 'Reverse engineering is a computer programming technique used to analyse software in order to categorise and understand the parts of that software. The usual reasons for reverse engineering a piece of software are to recreate the program or to build something similar to it.' Sometimes, in an illegal manner, reverse engineering is viewed as the craft of the cracker who uses his skills to remove copy protection from software or media. Reverse engineering is also the process of discovering the technological principles of a device, object or system through analysis of its structure, function and operation. The importance of reverse engineering in the field of Information Technology is rapidly increasing due to the growth of the information security sector. The positive applications of reverse engineering include understanding the capabilities of a product's manufacturer, understanding the functions of a product in order to create compatible components, determining whether vulnerabilities exist in a product, and determining whether an application contains any undocumented functionality.

Reverse engineering is often done because the documentation of a particular device/software/system has been lost (or was never written) and the person who built it is no longer available. For example, integrated circuits often seem to have been designed on obsolete, proprietary systems, which means that the only way to incorporate the functionality into new technology is to reverse-engineer the existing chip and then re-design it. Reverse engineering enables finding out how a product works and what its main components are. When applied to information security, reverse engineering aids security auditing and fraud checking. The application of reverse engineering will be highlighted using a series of informative demonstrations.

2.5.1. Ethical Reverse Engineering
A reverse engineer may carry out reverse engineering to mitigate failures to check for error conditions, poor understanding of function behaviour, poorly designed protocols, improper testing of boundary conditions, and bugs and flaws in software. It can also be used to measure the skills of a software engineer/coder, to verify the security level of a program, to verify that a program is not infected by malware code, and to enhance the CPU and memory management of a piece of software. These reverse engineering technologies are used by the following audiences: software engineers/developers, software architects, information security experts, malware analysts (malware forensics experts), malcode scientists, code auditors and reviewers, system/device auditors, digital forensics investigators and antivirus researchers. A few real-world reverse engineering scenarios: analysing network traffic dumped into raw binary files, or reversing system crash dumps (a process particularly useful for telecom engineers, digital forensics investigators, government intelligence, police and the military); malware and malcode experts identifying the architecture of malcode binaries and the attack methodologies of the malcode; and, by reversing existing software, software engineers/architects developing highly secured enterprise-class applications using anti-reverse-engineering techniques.

2.5.2. Reverse Engineering - An Illegal Thing?
(Hoglund and McGraw, 2004) mention that 'reverse engineering can be used to reconstruct source code, and it walks a fine line in intellectual property law. Many software license agreements strictly prohibit reverse engineering, since software companies fear that their trade-secret algorithms and methods could be directly revealed through it.' In my argument, even though most global explanations associated with reverse engineering point to its illegal aspect, I firmly criticize those views and argue that reverse engineering is not an illegal task. In a solid manner, it depends on individuals' intellects and their own views.

2.6. Botnet
(Ferguson, 2010) mentions: 'A botnet refers to a network of bots or zombie computers widely used for malicious criminal activities like spamming, Distributed Denial-of-Service (DDoS) attacks, and/or spreading malware variants. A botnet mostly connects to Command-and-Control (C&C) servers over TCP port 6667 (IRC), enabling a bot-master (bot-herder) or controller to make updates and to add new components to it.' Supporting this statement, (Margaret, 2012): 'A botnet (zombie army) is a number of Internet computers that, although their owners are unaware of it, have been set up to forward transmissions (including spam or malware) to other computers on the Internet. Any such computer is referred to as a zombie - in effect, a computer "robot" or "bot" that serves the wishes of some master spam or malware originator. Botnets currently pose the biggest threat to the Internet.' In my argument, even though most global explanations associated with botnets point to their illegal aspect, I firmly criticize those views and argue that botnets are not an inherently illegal type of software. This, too, depends in a solid manner on individuals' intellects and their own views.

3.1. Introduction
Before we start the system design, system analysis and system development stages, we have to gather user requirements precisely. Only on the basis of a successful requirements analysis can we create successful customer-oriented software applications. No software can achieve a high success rate in the field of software development without accumulating the customer requirements (data collection). Therefore this project follows the same software strategy.

3.2. Information Gathering
In this project the key whistle-blowers are Mr Edward Snowden and Mr Julian Assange of WikiLeaks. This project might never have been introduced to the world without their brave exposures. I greatly appreciate their countless efforts for the whole world, and my heartiest and most honourable salutations fly out to those brave souls.

3.2.1. Project WikiLeaks and Mr Julian Assange
WikiLeaks is an online, non-profit journalistic project which publishes top-secret information, information leakages and confidential media from anonymous sources. Julian Paul Assange (born 3 July 1971 in Townsville, Queensland) is an Australian publisher and journalist, best known as editor-in-chief of the whistle-blower website WikiLeaks, which he founded in 2006 after an earlier career in hacking and programming. WikiLeaks achieved particular renown in 2010 when it published U.S. military and political documents disclosed by Chelsea Manning. It has also published many information security (IT) related documents on the WikiLeaks website, mainly concerning spying (espionage), backdooring and similar subjects.
3.2.2. Whistle-blower Mr Edward Snowden
He is the major character in my Project Mahabarana. Edward Joseph Snowden (born 21 June 1983) is an American computer specialist, a former employee of the Central Intelligence Agency (CIA) and a former defence contractor for the National Security Agency (NSA). He came to international attention when he disclosed stacks of confidential documents to a few media outlets. The release of classified material, beginning on 5 June 2013, was called the most significant leakage of its kind; it publicised Internet surveillance programs such as Project PRISM, Project BULLRUN, Project MUSCULAR, Project XKeyscore and Project Tempora, and the bulk collection of US and European telephone metadata. The reports were based on documents Snowden disclosed to The Guardian and The Washington Post while employed by NSA contractor Booz Allen Hamilton.

According to Edward Snowden, the US Government is backdooring and spying on all security-related systems and reading all the data. As an example, Microsoft Corporation has reportedly agreed to backdoor all its software for the FBI and NSA; other examples are explained in the real-world examples section. Journalists later reported that the NSA and FBI were moving to spy on Android mobile apps and phones even more intensively at the back end, even spying on user activities through the most popular and famous game, 'Angry Birds'.

Illustration 3.1: Project Prism - Tasking Process (Leaked)
Also, sometimes there is a possibility of spying on the e-mail communication between two parties. Even encrypted VoIP communication (e.g. Skype) can be eavesdropped on by those secret organizations, using secret man-in-the-middle backdoor access. When we install software applications, there is a step where we must agree or disagree to the vendor's private EULA. In such EULAs there are no clauses or sections regarding this illegal top-secret backdoor access. This is also an unethical and most uncivilised proceeding that we need to consider keenly. In the EULA, thousands of rules and regulations are set by the software vendor, and the end-user must agree to all those terms and conditions. My precise question is: 'Who will be responsible for such customer privacy violations (data leakages through top-secret backdoor access), and who should be prosecuted legally for such violations?' I firmly criticize the unethical and uncivilised approaches of these vendors together with global authoritative top-secret organizations such as the NSA, CIA and FBI. Therefore I bravely argue with those software vendors in this manner: if they sell their proprietary software applications with silent legal/illegal backdoors, without notifying the customers or end-users, in order to support the global intelligence organizations, then the end-users should also have a solid right to decompile and disassemble those proprietary software binaries into disassembly code and check whether backdoors and associated malware code are embedded in the original binaries. If such malicious code is found in the original binaries, the end-user or customer should have the sole right not to purchase that software into their internal workflow.

3.2.3. Real-World Examples
Analysing the WikiLeaks and Snowden details, we can understand that the NSA and FBI try to create cyber restrictions in their own image, and the entire planet is currently focused on such illegal cyber espionage and/or spying by the resilient, prevailing governments of the world, for the sake of information freedom. The subsequent specific examples provide and prove the wider picture of the aforementioned illegal activities carried out by those powerful governing authorities.

1. Cisco backdoor still open - an IBM researcher at Black Hat says the opening for the Feds exposes us.
2. Snowden leak: Microsoft added a backdoor for the Feds - the NSA praised Redmond for 'collaborative teamwork'; Microsoft handed the NSA access to encrypted messages.
3. NSA cracks Internet crypto - MasterSpy, FBI OpenBSD backdoors and the RSA cipher vulnerability.
4. German government warns key entities not to use Windows 8 - links it to the NSA.
5. The NSA's infatuation with 'backdoor' penetration and Cisco lawful interception - Cisco patent: 'Policy-based content intercept'.
6. The US Government started the PRISM (Planning Tool for Resource Integration, Synchronization, and Management) programme for foreign intelligence surveillance.
7. Facebook may lawfully disclose aggregate data regarding any orders and/or directives that Facebook may have received under the Foreign Intelligence Surveillance Act.
8. Microsoft legally agreed that the NSA and FBI communicate directly with users' databases, without offering definitive proof.
9. Project BULLRUN - made to defeat the encryption used in specific network communication technologies. (In simple terms, no one can release an unbreakable encryption algorithm as a product without the FBI's permission.)
10. Project Carnivore - a top-secret FBI wiretapping and eavesdropping project.

Illustration 3.2: Project Prism - Collection Details (Leaked)

Illustration 3.3: Project Prism - Case Notations (Leaked)

Illustration 3.4: Project Prism - FAA702 Operations (Leaked)

3.3. Development Methodology
3.3.1 Overview
Project Mahabarana draws a straight upstream line of attention to the most present-day issues existing in the world, and those are burdens of the most indispensable level. At first it is an utmost difficult task to understand the entire conceptual framework, owing to its nature and level of complexity. Therefore a person trying to understand this project for the first time, without comprehensive knowledge of the high-tech information security background, might face tons of problematic delicacies and may sometimes never be able to understand it.

3.3.2. Sheep-Dip Environment and Testing Methodology
A sheep-dip check is basically the same as a malware-checking system; the sheep-dip work process analyses the malware embedded in binary files. These types of technologies are used by big businesses, test labs, universities, the military, etc., mainly targeting USB malware, hardware malware and software malware; sheep-dip checking examines those specific hardware and software programs in a diversified manner. To prepare a sheep-dip environment, we need to check and obtain (purchase) at least 10 different top-rated anti-malware products, selected by checking unbiased global rankings. We then create 10 virtual machines inside a high-end server system; each virtual machine consists of a common operating system plus one of the anti-malware products purchased previously.

After the successful preparation of the sheep-dip environment, we check the suspicious software applications and hardware-based (USB, CD, DVD, Blu-ray disc, etc.) binary applications inside the sheep-dip environment for known or unknown malware prior to deployment in the production environment. Because we select at least 10 different anti-malware products, we have a fair chance of detecting any form of malware, thanks to the diversified malware variants in their anti-malware databases.
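The decision step of the sheep-dip process above can be sketched as a simple verdict aggregator, assuming each of the ~10 anti-malware VMs reports back whether it flagged the sample. The engine names and detection labels below are hypothetical placeholders, not real product output.

```python
from dataclasses import dataclass

@dataclass
class ScanResult:
    engine: str      # which anti-malware VM produced the verdict
    detected: bool   # True if that engine flagged the sample
    label: str = ""  # engine-specific detection name, if any

def sheep_dip_verdict(results, quorum=1):
    """Block deployment if at least `quorum` engines flag the sample."""
    hits = [r for r in results if r.detected]
    status = "QUARANTINE" if len(hits) >= quorum else "CLEAN"
    return status, [f"{r.engine}: {r.label}" for r in hits]

# Hypothetical scan of one suspicious binary across two of the VMs:
status, hits = sheep_dip_verdict([
    ScanResult("EngineA", False),
    ScanResult("EngineB", True, "Trojan.Generic"),
])
```

A `quorum` of 1 reflects the conservative policy described above: a single detection out of the diversified engines is enough to keep the binary out of production.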

3.3.3. Sandbox
A sandbox is a virtualised application isolation program, available for x86/x64 Unix and Windows platforms. It works as an isolated operating environment: a program runs or installs in the sandbox without permission to modify the local or mapped drives. This isolated virtual environment allows controlled testing of untrusted programs, malware and web surfing, so we can protect our machine from malware infections. A sandbox is, simply put, like a box.
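A minimal sketch of the isolation idea only: run an untrusted command in a throwaway working directory with a hard timeout, so anything it writes there vanishes afterwards. Real sandboxes (Sandboxie, seccomp jails, full VMs) additionally virtualize the filesystem, registry and network; this sketch shows just the "whatever happens in the box stays in the box" principle.

```python
import subprocess
import sys
import tempfile

def run_isolated(argv, timeout=10):
    """Run `argv` in a temporary working directory with a timeout."""
    with tempfile.TemporaryDirectory() as scratch:
        # Files the child creates in its working directory are deleted
        # together with the temporary directory when the block exits.
        proc = subprocess.run(argv, cwd=scratch, capture_output=True,
                              text=True, timeout=timeout)
    return proc.returncode, proc.stdout

# Demonstration with a harmless child process:
rc, out = run_isolated([sys.executable, "-c", "print('hello from the box')"])
```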

Illustration 3.5: Sandboxing Methodological Architecture

3.3.4. Reverse Engineering
(Eilam, 2005): 'Reverse engineering is the process of extracting the knowledge or design blueprints from anything man-made.' In other words, reverse engineering is a process in which we inspect the internal workflow of a product and acquire its secret hidden operations as well as the total system or software architecture. This approach is sometimes identified as an illegal form of activity by the professional domain, but sometimes reverse engineering must essentially be done in a legal manner (e.g. in computer forensics).

3.3.5. PE (Portable Executable - EXE File Format) and Its Runtime
Basically, PE means Portable Executable. This format is the native Microsoft Windows file format for executables, object code, DLLs (Dynamic Link Libraries), FON (system font) files, and others used in the 32-bit and 64-bit versions of the Microsoft Windows operating systems. The PE format is a data structure that encapsulates the information necessary for the Windows OS loader to manage the wrapped executable code. This includes dynamic library references for linking, API export and import tables, resource management data and thread-local storage (TLS) data. On NT-based operating systems, the PE format is used for EXE, DLL, SYS (device driver) and other file types. PE files are the most exploitable entry point of the Windows system.
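The anchoring structures of the layout described above can be read with a few lines of Python: the DOS header's "MZ" magic, the `e_lfanew` field at offset 0x3C pointing to the PE header, and the "PE\0\0" signature followed by the COFF machine type and section count. This is a sketch of the first parsing step only, not a full PE parser.

```python
import struct

def parse_pe_signature(data: bytes):
    """Locate and read the PE signature inside a raw executable image."""
    if data[:2] != b"MZ":
        raise ValueError("not a DOS/PE executable (missing MZ magic)")
    # e_lfanew: 32-bit offset of the PE header, stored at 0x3C in the DOS header.
    (e_lfanew,) = struct.unpack_from("<I", data, 0x3C)
    if data[e_lfanew:e_lfanew + 4] != b"PE\x00\x00":
        raise ValueError("missing PE\\0\\0 signature")
    # COFF file header starts right after the signature: Machine, NumberOfSections.
    machine, num_sections = struct.unpack_from("<HH", data, e_lfanew + 4)
    return {"e_lfanew": e_lfanew, "machine": machine, "sections": num_sections}
```

For example, a machine value of 0x8664 identifies an x64 image and 0x014C an x86 image; the section count tells a disassembler how many section headers to walk next.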

Illustration 3.6: Basic PE File Architecture

A PE file has the following major components in its internal stack model: PE Header, Object Table, Export File Layout, Export Directory Table Entry, Export Address Table Entry, Export Name Table Entry, Import File Layout, Import Directory Entry, Import Address Table Format, Import Hint-Name Table, Thread Local Storage Layout, Thread Local Storage Directory Table, Thread Local Storage Callback Table, Resource File Layout, Resource Table Entry, Resource Directory Entry, Resource Directory String Entry, Resource Data Entry, Fixup Block Format, Fixup Record Format and Debug Directory Entry.

Illustration 3.7: Advanced PE File Architecture

Illustration 3.8: Advanced PE File Internal Process Architecture

Illustration 3.9: Understanding the PE File Execution Process in Microsoft Windows

PE Architectural Weaknesses (Failures)
The PE architecture allows: foreign code injection, buffer overflows, buffer underruns, section dumps, partial and full segment dumps, resource dumps/modification, debugging, disassembly, file memory chunking, runtime file dumps, CRC and checksum changes, memory leakages, PE file architectural changes/modification, serial number (registration information) sniffing/phishing, open import and export information, PE header changes, API redirection and API spying (API sniffing/phishing), Trojan/virus and PE file binder code injection, binary patching, file runtime memory sniffing/phishing, online binary PE modification, version changes and file architectural conversion, and program hijacking.

Resulting Incidents due to the PE Weaknesses
Serial numbers can be extracted from commercial software; keygens/keymakers can be coded for commercial software; trial/nag-screen/registration patches (cracks) can be coded; resources of commercial software can be modified or extracted; API debugging, decompilation, reversing and recompilation can be done; commercial software can be hijacked at runtime or offline (toolbars); the original source of commercial software can be reverse engineered; viruses/Trojans and other malicious foreign code can be executed alongside commercial software; and valuable information can be stolen using steganography methods.

3.3.6. Decompiler/Disassembler and Decompiling/Disassembling PE File
A decompiler takes an executable file as input and attempts to create a high-level, compilable, possibly even maintainable source file that does the same thing. It is therefore the opposite of a compiler, which takes a source file and makes an executable. A general decompiler does not attempt to reverse every action of the compiler; rather, it transforms the input program repeatedly until the result is high-level source code. It will not recreate the original source file.

A disassembler is the exact opposite of an assembler. Where an assembler converts code written in an assembly language into binary machine code, a disassembler reverses the process and attempts to recreate the assembly code from the binary machine code.
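The mapping a disassembler performs can be illustrated with a toy sketch that decodes only a handful of x86 opcodes back to mnemonics. Real disassemblers (IDA Pro, HIEW) decode the full variable-length instruction set, follow control flow and resolve symbols; this sketch shows only the byte-to-mnemonic principle.

```python
# A few single-byte x86 opcodes and their mnemonics.
ONE_BYTE_X86 = {0x90: "nop", 0xC3: "ret", 0xCC: "int3", 0xF4: "hlt"}

def disassemble(code: bytes):
    """Decode a byte string into a list of (toy) x86 mnemonics."""
    out, i = [], 0
    while i < len(code):
        op = code[i]
        if op in ONE_BYTE_X86:
            out.append(ONE_BYTE_X86[op])
            i += 1
        elif op == 0xB8 and i + 5 <= len(code):
            # B8 id: mov eax, imm32 (little-endian immediate follows opcode)
            imm = int.from_bytes(code[i + 1:i + 5], "little")
            out.append(f"mov eax, 0x{imm:x}")
            i += 5
        else:
            out.append(f"db 0x{op:02x}")  # unknown byte, emit as raw data
            i += 1
    return out
```

For instance, the six bytes `B8 2A 00 00 00 C3` decode to `mov eax, 0x2a` followed by `ret`, which is exactly a function returning 42.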

The purposes of decompiling/disassembling a binary include: recovery of lost source code (lost by accident or via a disgruntled employee); migration of assembly-language applications to a new hardware platform; translation of code written in obsolete languages no longer supported by compiler tools; determination of the existence of viruses or malicious code in the program; and recovery of someone else's source code (to determine an algorithm, for example).

Illustration 3.10: Convert Machine (Binary) Code to Assembly Code

Illustration 3.11: Hexadecimal Representation of Machine (Binary) Code


Illustration 3.12: Disassembly Code Representation of the Binary

Illustration 3.13: Decompiled Code of the Disassembled Code

Illustration 3.14: Data Rescue IDA-Pro PE Disassembling in Action

Illustration 3.15: Data Rescue IDA-Pro Live Data Flow Graph of PE File

Illustration 3.16: BDASM Disassembling PE File (ASM Code, HEX and ASCII)

Illustration 3.17: NuMega/Compuware SoftICE - Kernel Mode Disassembly

Illustration 3.18: NuMega/Compuware SoftICE - Kernel Mode Deep Drill Down

Illustration 3.19: HIEW (Hacker's View) - Linux ELF 64 Bits Disassembly

Illustration 3.20: HIEW (Hacker's View) - Windows PE 32 Bits Disassembly

3.3.7. Cryptor and Packer
A cryptor is basically a command line/GUI application and may be coded in numerous programming languages. A cryptor encrypts the binary executable file and injects it into a container (any module that can contain other modules inside itself); this container is also coded as a stub. The main methodology is to provide a precompiled container binary and to inject the input file into it, which requires the PE header of the container to be comprehensively modified.
A packer is essential when software vendors want to protect their binary executable files. The usual reason is to help protect the code/data inside, and compression characteristically does this; the method is called executable compression. The goals of this compression are to hinder reverse engineering of the binary executable and to protect its resources.

3.3.8. PE-Crypting and PE-Compacting
In this project we use the aforesaid PE-crypting/decrypting and PE-packing/unpacking methods in some instances. The PE-crypter Hyperion, for example, mainly affects the PE header. UPX (Ultimate Packer for eXecutables), ASPack and PECompact are classical examples and achieve top-level compression ratios on native PE executables for the Microsoft Windows platform. Packing technology is also referred to as compression, just like ZIP, CAB, RAR, etc. Many types of packers are available as freeware, open source (FOSS) and commercial-ware, and sometimes the reverse engineering (cracking) community also codes such packers, unpackers, cryptors and decryptors. Every packer/cryptor has its own crypting or packing algorithm; some use standard algorithms and formats such as LZMA, PPMd, BCJ, BCJ2, BZip2, Deflate, 7z, GZIP, ARJ, LZH, Z, CPIO, RPM, PKZip, WinZip and DEB, while others use non-standard algorithms.
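The core of executable packing can be sketched with a simple compression round-trip. This uses zlib's Deflate (one of the standard algorithms named above); real packers such as UPX use their own LZMA/NRV-based schemes and additionally rewrite the PE header and attach a decompression stub, none of which this sketch attempts.

```python
import zlib

# Executable packers shrink the code/data sections and prepend a small stub
# that decompresses them at run time. This sketch shows only the lossless
# compression round-trip; the header rewriting and stub generation that a
# real packer performs are omitted.
def pack(section: bytes) -> bytes:
    return zlib.compress(section, level=9)

def unpack(packed: bytes) -> bytes:
    # In a real packed binary, this logic lives inside the stub.
    return zlib.decompress(packed)

if __name__ == "__main__":
    code_section = b"\x90" * 4096          # highly redundant data compresses well
    packed = pack(code_section)
    assert unpack(packed) == code_section  # lossless round-trip
    print(f"{len(code_section)} bytes -> {len(packed)} bytes")
```

The round-trip property is what distinguishes packing from crypting: a packer's transformation must be exactly reversible by the stub at load time.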
3.3.9. Hash Basket Server
A hash function is a mathematical function that maps data of arbitrary size to a value of fixed length. The values produced by a hash function are called hash values, hash codes, hash sums, checksums or simply hashes. All the generated hashes will be fed into a discrete SQL (Structured Query Language) or NoSQL (Not Only SQL) server database to preserve the integrity of the (compressed/encrypted) binary files.
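A minimal sketch of the Hash Basket idea follows: compute a digest for each archive and keep it in a SQL table so the file's integrity can be re-verified later. The table and column names here are illustrative choices for this sketch, not part of the project specification.

```python
import hashlib
import sqlite3

# Hash Basket sketch: store one digest per archive in a SQL table and
# compare against it later to detect tampering or corruption.
def file_digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def store_hash(db, filename: str, digest: str) -> None:
    db.execute("CREATE TABLE IF NOT EXISTS hash_basket "
               "(filename TEXT PRIMARY KEY, digest TEXT)")
    db.execute("INSERT OR REPLACE INTO hash_basket VALUES (?, ?)",
               (filename, digest))

def verify_hash(db, filename: str, data: bytes) -> bool:
    row = db.execute("SELECT digest FROM hash_basket WHERE filename = ?",
                     (filename,)).fetchone()
    return row is not None and row[0] == file_digest(data)

if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    payload = b"compressed/encrypted archive bytes"
    store_hash(db, "package-001.bin", file_digest(payload))
    print(verify_hash(db, "package-001.bin", payload))           # True
    print(verify_hash(db, "package-001.bin", payload + b"x"))    # False (tampered)
```

Any single-bit change to the stored archive changes the digest, so the later comparison step in the process flow catches tampering during transit or storage.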

3.3.10. Cloud Process
Cloud computing involves a large number of computers connected through a communication network such as the Internet, similar to utility computing. In science, cloud computing is a synonym for distributed computing over a network and means the ability to run a program or application on many connected computers at the same time. Network-based services, which appear to be provided by real server hardware but are in fact served up by virtual hardware simulated by software running on one or more real machines, are often called cloud computing. Such virtual servers do not physically exist and can therefore be moved around and scaled up or down on the fly without affecting the end user, somewhat like a cloud growing larger or smaller without being a physical object.

Illustration 3.21: Cloud Basic Architecture

Illustration 3.22: Cloud Stacked Architecture

Illustration 3.23: Cloud Vendor/Service Architecture

Once interconnected, such a network is called a Cloud. Cloud computing provides many public, private and hybrid services. Google, Amazon, Apple, Oracle Cloud, Salesforce, Zoho and Microsoft Azure are some famous and renowned cloud vendors. In our project we are going to upload the packaged binaries to one or more of these selected Clouds. Each folder will be associated with a unique 'User Name' and 'Pass Code'.

• Smart Soft-Grid Based Symmetric Multi-Processing in the Cloud
Symmetric multiprocessing here means the combination of thousands of powerful computers that together form supercomputer capacity (e.g. the Google search engine with its Python-based spidering bots). The following illustration shows, as an example only, how symmetric multiprocessing functions in the cloud.

Illustration 3.24: Symmetric Multiprocessing Architecture

Key Returns of the Cloud Computing
a) More competent and rapid delivery of services/facilities at a sustainably lower cost (less capital and operational expenditure).

b) Augmented data/information security and privacy, including data integrity management (Confidentiality, Integrity and Availability).

c) Remote storage capacity can be scaled in accordance with the requirements of service/user capacity planning.

d) Enhanced mobility, uninterruptible service (maximum uptime, e.g. 99.9%), localization and customizability.

e) Typically comparable to supercomputers (nearly 89% compatible with supercomputers).

3.3.11. Authentication Database Server (Auth-Database)
Every Cloud folder is protected with authentication credentials and permissions, which are converted into MD5-based salted hashes or SHA-1/SHA-2 (256/512-bit) hash sets, or AES (256-bit) encrypted values, and saved inside a SQL or NoSQL database. We might not trust the cloud service vendors, because our previous examples show that almost all commercial cloud service vendors are backdoored (e.g. Google, Microsoft, etc.) and, according to WikiLeaks (Julian Assange) and Edward Snowden, they surely spy on customer data. Based on those revelations, authoritative governments can always hijack our cloud-based system and steal or sneak a peek into our private and confidential data in accordance with the NSA's Project PRISM, and they may decrypt any commercial/open source encryption algorithm as well in accordance with the NSA's Project BULLRUN.

Hashed (Digest) Passwords
A cryptographic hash function is a hash function that takes an arbitrary block of data and returns a fixed-size bit string, the cryptographic hash value, such that any (accidental or intentional) change to the data will (with very high probability) change the hash value.
The data to be encoded are often called the message, and the hash value is sometimes called the message digest or simply digest.

Illustration 3.25: Plain-Text to Digesting

Illustration 3.26: Plain-Text to Digesting

Salting Hashed Passwords
A salt is random data that is used as an additional input to a one-way function that hashes a password or passphrase. The primary function of salts is to defend against dictionary attacks and pre-computed rainbow table attacks. A new salt is randomly generated for each password. In a typical setting, the salt and the password are concatenated and processed with a cryptographic hash function, and the resulting output (but not the original password) is stored with the salt in a database. Hashing allows for later authentication while defending against compromise of the plaintext password in the event that the database is somehow compromised.
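The salting scheme described above can be sketched briefly: a fresh random salt per password, concatenated with the password and digested, with the salt and digest (never the plaintext) being what gets stored. A production system would use a deliberately slow KDF such as PBKDF2, bcrypt or scrypt rather than the single SHA-256 pass shown here.

```python
import hashlib
import hmac
import os

# Salted password hashing sketch: the stored record is (salt, digest).
# A dictionary or rainbow-table attack must now be repeated per salt.
def hash_password(password: str, salt: bytes = None) -> tuple:
    salt = salt if salt is not None else os.urandom(16)   # new random salt each time
    digest = hashlib.sha256(salt + password.encode()).hexdigest()
    return salt, digest

def verify_password(password: str, salt: bytes, stored_digest: str) -> bool:
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored_digest)     # constant-time compare

if __name__ == "__main__":
    salt, digest = hash_password("Pa55Code!")
    print(verify_password("Pa55Code!", salt, digest))   # True
    print(verify_password("wrong", salt, digest))       # False
```

Because each password gets its own salt, two users with identical passwords produce different stored digests, which is exactly the property that defeats pre-computed rainbow tables.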

Illustration 3.27: Salted Hashed Passwords

3.3.12. Bots and Botnets in a Nutshell

Botnets
The word botnet is a combination of the words robot and network. The term is usually used with a negative or malicious connotation. Contemporary bots have AI (Artificial Intelligence) - neural-network-based intelligence for specific task-oriented decision making. A botnet is a collection of bots inter-connected through the Internet via global/private IRC channels; in other words, a collection of Internet-connected programs communicating with other similar programs in order to perform tasks. This can be as mundane as keeping control of an Internet Relay Chat (IRC) channel over TCP port 6667, or it could be used to send spam email or participate in Distributed Denial-of-Service (DDoS) attacks. Botnets can be coded in native or managed code using assembly (Flat Assembler (FASM), Netwide Assembler (NASM), and sometimes Borland Turbo Assembler (TASM) or Microsoft Macro Assembler (MASM)), Borland C/C++ and Delphi, GCC with MinGW or Free Pascal, or Microsoft C#/VB.NET as native-compiler-based languages, and Python/Django, PHP, ASP.NET, ASP, Perl, Ruby on Rails and Tcl as host-oriented scriptable programming languages. The bots of a botnet can be deployed in numerous diverse geographical locations around the world.
Sometimes a botnet can be a mixture of the two types: the botnet control console can be coded in native or managed code (the most efficient and recommended way), and the bots can be coded as scriptable code (also an effective, efficient and recommended way, because the bots can then be hosted on web servers, a hassle-free method for rooting/installing them). In contemporary days bots can be hosted on numerous free web hosting services around the world.

Illustration 3.28: Bots and Botnets

Legal Bots/Botnets
The term botnet is widely used when several IRC bots have been linked and may set channel modes on other bots and users while keeping IRC channels free from unwanted users. This is where the term originally comes from, since the first illegal botnets were similar to legal botnets. A common bot used to set up botnets on IRC is Eggdrop. Popular email service providers are also embedded with such legal bots (e.g. the Google Gmail bot, Microsoft Live mass-mail bot, Facebook social media bot, etc.). Legal botnets sometimes help prevent cybercrime, and sometimes these bots help with SEO (Search Engine Optimization).

Illustration 3.29: Bots and Botnets Structured Stack

Key Rudiments of Botnet C&C Server (Command and Control Server)
The IRC server here is known as the command-and-control (C&C) server. Though rarely, more experienced botnet operators program command protocols from scratch. Command and control servers are mostly implemented as centralized servers, over which malicious bot-herders/bot-masters issue commands and receive results from the bots. The C&C is the major segment of the botnet, because every botnet IRC command is transferred through the command-and-control channels. If someone wants to launch a DDoS (Distributed Denial of Service) attack, the bot-herder/bot-master sends a pre-coded chat-based IRC command over the C&C server to the corresponding bots planted in the botnet; every bot in that botnet will obey the pre-coded command and perform the given task precisely and without hesitation. Botnet C&C servers have four tiered topologies: star, multi-server, hierarchical and random.
An Internet Relay Chat (IRC) server/channel is the major command-transferring server and is also based on a client/server mechanism. An IRC channel is mostly deployed as software and can be installed on private as well as public hosting providers. IRC clients are used to send/receive messages with other clients, and client software is available for every major operating system that supports Internet access. As of April 2011, the top 100 IRC networks served more than half a million users at a time, with hundreds of thousands of channels operating on a total of approximately 1,500 servers out of roughly 3,200 servers worldwide. An IRC channel uses TCP (Transmission Control Protocol) and optionally the secure TLS (Transport Layer Security) protocol to send data. There are several client implementations, such as mIRC, XChat and irssi, and several IRC server implementations; some people use custom-coded IRC servers and clients for manual IRC communication over IRC channels for their own purposes. IRC was originally a plain text protocol, and IANA (Internet Assigned Numbers Authority) later assigned it port 194/TCP. However, the de facto standard (a custom, convention, product or system that has achieved a dominant position by public approval or market forces) has always been to run IRC on 6667/TCP and nearby port numbers (TCP ports 6660-6669 and 7000) to escape having to run the IRCd (Internet Relay Chat Daemon) software with root privileges on UNIX-based operating systems.
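The C&C traffic described above rides on ordinary IRC protocol lines of the form ":prefix COMMAND params :trailing" (RFC 1459). This sketch parses one such line; it is a protocol illustration only, not a bot or client implementation.

```python
# Minimal RFC 1459 message parser: an optional ":prefix", a command,
# space-separated parameters, and an optional trailing parameter
# introduced by " :" that may itself contain spaces.
def parse_irc_line(line: str) -> dict:
    prefix, trailing = None, None
    line = line.rstrip("\r\n")
    if line.startswith(":"):                       # optional source prefix
        prefix, line = line[1:].split(" ", 1)
    if " :" in line:                               # optional trailing parameter
        line, trailing = line.split(" :", 1)
    parts = line.split()
    params = parts[1:]
    if trailing is not None:
        params.append(trailing)
    return {"prefix": prefix, "command": parts[0], "params": params}

if __name__ == "__main__":
    msg = parse_irc_line(":nick!user@host PRIVMSG #channel :hello world")
    print(msg["command"], msg["params"])   # PRIVMSG ['#channel', 'hello world']
```

The same grammar carries both legitimate chat and pre-coded bot commands, which is why C&C traffic over port 6667 can blend in with ordinary IRC usage.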

• Bot-Herder/Bot-Master
The botnet's originator or attacker, mostly sitting in a distant landmass or different geographical location.

Zombies/Bots
A computer connected to the Internet that has been compromised by a hacker and can be used to perform malicious tasks of one sort or another under remote direction. It typically runs hidden and uses a covert channel (e.g. the RFC 1459 (IRC) standard, or IM) to communicate with its C&C server.

Supplementary Discreet and Profound Study on Botnets
Most of the time botnets follow their own characteristic approaches to malicious activities such as DoS (Denial of Service) and DDoS (Distributed Denial of Service) attacks. Bots can be installed manually or fully automatically through online resources using APTs (Advanced Persistent Threats, e.g. spear phishing); bots can be installed into native operating systems, some botnets can be hosted in remote server environments (e.g. a cloud host), and some contemporary bots can be hosted in virtual machine (VM) environments. Some hackers install bots through their own custom-made procedures, always using very unique and advanced practices. Examples of popular botnets in the wild are Zeus, Zeronet, Donbot, Mega-D and LowSec; these are among the most powerful botnets in the world.

• Focused Defence Mechanisms for Botnets
To deploy bots to numerous operating systems, hackers or malicious people use different mechanisms to evade the default security by applying anti-anti-malware techniques. The following are the most commonly used techniques for achieving successful anti-malware evasion; in this section we discuss the techniques bots use upon installation into the operating system.

PE-Joiner and PE-Binders
PE-joiners/PE-binders are small programs that easily join (bind) two or more files into a single executable. That executable (the one into which the files are included - the carrier or stub) is a simple compiled program that, when opened, automatically launches the included files one by one. Any type of file can be run from the joined file without affecting the key file's functionality. The focused key features are: any file type that the executable needs to run properly can be joined; an unlimited number of files can be bound; options exist to encrypt the bound files and to change the icon of the bound file; and the joiner 'stub' is undetectable when scanned with major antivirus products. The key purposes of binders are to distribute a program as a single executable, to hide files, and to open/run more files from the same program/executable.
This operation is sometimes called EXE bundling. It bundles application files (sometimes non-executable files, or any sort of data, binary or non-binary file format) into a single package and can also act as a file splitter. An 'Add File(s)' option lets you add the executable or source files (source files are extracted; EXE/SCR files start directly in memory) that you would like to join; clicking the 'Bundle' button then consolidates all the selected files into a single file. The stub acts as an EXE loader: it is enough to start only one executable file, and the rest of the application files are started automatically in the background from the embedded files. Bundlers mostly offer an option to add icon files in addition to executable and source files. Some options prove very interesting: bundlers can also join DLL files, eliminating the possibility of a 'DLL not found' error, which makes the bundler a favourite among users. Some advanced bundlers also act as PE-packers/cryptors, creating a packed/crypted file that can be further protected or compressed with other PE loaders and tools. Some binders include a test option for testing the bundled application files, and all the resources explained above can be applied to applications stored as a project. A higher compression ratio yields smaller compressed files, so if you have many files to join there may be a 'compress high ratio' option; joining files in higher volumes can also help evade anti-malware. Some binders thus act as a single-source file splitter/joiner which, once installed, makes the tasks of file joining and splitting simple and effective.
Binders allow you to easily join together and split apart files of all kinds. PE-joiners can be used across all platforms, and they act not only as file splitter/joiners but also as tools that protect your files efficiently. Having seen the benefits and working methodology of the PE-joiner, you can install the stub and enjoy the features of a file splitter, file joiner and compressor in one single application.
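From the auditing side, one common artefact of crude binding is an "overlay": bytes appended after the last section described by the PE headers. A hedged detection sketch, assuming a well-formed PE file, can compare the file's real size against the end of the last section computed from the section table; a non-zero difference flags appended data worth inspecting.

```python
import struct

def pe_overlay_size(data: bytes) -> int:
    """Return the number of bytes appended after the last PE section.

    A non-zero result means the file carries an 'overlay' -- a common
    hiding spot for payloads bonded onto a host executable.
    """
    if data[:2] != b"MZ":
        raise ValueError("not an MZ executable")
    e_lfanew = struct.unpack_from("<I", data, 0x3C)[0]       # offset of PE signature
    if data[e_lfanew:e_lfanew + 4] != b"PE\x00\x00":
        raise ValueError("not a PE file")
    num_sections = struct.unpack_from("<H", data, e_lfanew + 6)[0]
    opt_header_size = struct.unpack_from("<H", data, e_lfanew + 20)[0]
    section_table = e_lfanew + 24 + opt_header_size
    end_of_image = section_table + 40 * num_sections         # section headers are 40 bytes
    for i in range(num_sections):
        off = section_table + 40 * i
        raw_size, raw_ptr = struct.unpack_from("<II", data, off + 16)
        end_of_image = max(end_of_image, raw_ptr + raw_size)
    return max(0, len(data) - end_of_image)

if __name__ == "__main__":
    # Build a minimal synthetic PE for the demo: MZ stub, PE signature, and
    # one section of 0x10 raw bytes at file offset 0x80 (values invented here).
    dos = b"MZ" + b"\x00" * 58 + struct.pack("<I", 0x40)
    coff = b"PE\x00\x00" + struct.pack("<HHIIIHH", 0x14C, 1, 0, 0, 0, 0, 0)
    section = (b".text\x00\x00\x00" + struct.pack("<II", 0x10, 0x1000)
               + struct.pack("<II", 0x10, 0x80) + b"\x00" * 16)
    pe = (dos + coff + section).ljust(0x80, b"\x00") + b"\x90" * 0x10
    print(pe_overlay_size(pe))                     # 0 -> no overlay
    print(pe_overlay_size(pe + b"BOUND PAYLOAD"))  # 13 -> appended data detected
```

Overlay presence alone is not proof of binding (installers legitimately carry overlays), so in practice this check feeds the wider audit rather than yielding a verdict by itself.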

Illustration 3.30: Advanced PE-Binder Process Stack
In this manner, building bonded packages (binding invasive malware bots) into software cave spaces is a really effective way to evade anti-malware programs.

Process Hijacking and Process Hijackers
An invader gains control of a process that has been assigned elevated privileges in order to execute arbitrary code with those privileges. Certain processes are assigned elevated privileges on an operating system, frequently through association with a particular user, group or role. As a classic case study (fairly common on older or heavily used PCs), consider what the explorer.exe/svchost.exe processes do. Windows depends on lots of small 'helper' programs or processes that run in the background, which in turn rely on shared files called dynamic link libraries (DLLs). The job of the svchost.exe processes is to manage running services by organising them into groups that do similar things or share the same DLLs. If one of your svchost processes is working overtime, this may be due to redundant services, corrupt registry entries or, in rare cases, a virus or malware infection. If attackers can hijack one of these processes, they can assume its level of privilege to execute their own code. Processes can be hijacked through improper handling of user input (for example, a buffer overflow or certain types of injection attack) or by utilizing inadequately secured system utilities that support process control.

Illustration 3.31: Advanced Process Hijacking in Windows Environment

Seek and Self-Destroying (Melting) after the Process Hijack
When the crypted/packed bot is executed, the physical file itself 'melts': it deletes itself from the machine. Uninstallers typically delete themselves at the end of un-installation; there are other good reasons for self-deleting executables, but un-installation is probably the most common. Nevertheless, an executable cannot delete itself simply by calling the DeleteFile() function. By calling a self-destruct routine before program exit, the calling executable is destroyed as soon as possible. The neat thing about this technique is that the spawned child process never really executes (no windows ever appear); only the 'self-deleting' code in the child runs. After the parent process terminates and is deleted, the child process terminates using ExitProcess(), preventing anything further from happening. Effectively, an arbitrary child process is hijacked and coerced into deleting its parent. One variant of this technique uses CreateRemoteThread to inject the code into a suspended child process; this is quite neat because the child never runs - only the injected thread with the self-deleting code executes - but the downside is that it only works on Windows NT-based platforms. Other variants do not use CreateRemoteThread to inject code into the child and therefore work on all versions of Windows.

The self-deleting code is injected into the spawned child by hijacking the primary thread of the process. The thread's stack is manipulated so that space is 'allocated' on the stack, into which the self-deleting code is written. The instruction pointer of the thread is then altered so that it points to the injected code on the thread's stack. When the child process (and primary thread) is resumed, the code executes right off the stack, deletes the parent process, and then exits. There are some noted issues with this last technique. The first is the injection of code into an already running thread - not a desirable thing to do, but because the thread's 'real' execution is never resumed (the process exits as soon as the parent is deleted) it seems a viable approach. The second issue involves the Win32 environment. When a child process is created in a suspended state, the program has not started executing yet; a lot of Win32 environment setup must still be performed before the entry point of the executable is called. So when we hijack this thread to do our bidding, we are doing so in an environment that is not yet fully initialized and is not ready for Win32 API calls (it is still mid-way through executing some internal part of the OS). The API calls that are made (WaitForSingleObject, DeleteFile, etc.) do work, but this is not guaranteed in future OS releases. The exact same issue exists when you call CreateRemoteThread into a process that was started suspended: the OS has not yet finished initializing the process's environment before the remote thread starts to execute, which is why many Win32 API calls (such as RemoveDirectory) will fail.

Illustration 3.32: Basic Melting and Execution of PE File
3.3.13. Background Operation and Process
As soon as we upload the highly compressed/encrypted file archives to the various web folders in the Cloud, the bot-herder or bot-master (bot administrator) broadcasts the control command over the C&C (Command and Control) server (the IRC server) to execute the pre-assigned task. All the bots in the botnet then authenticate their built-in hashed 'User Names' and 'Pass Codes' against the Authentication Database Server, and upon successful authentication they are automatically redirected to the protected Cloud folders by presenting the genuine remote authentication codes verified by the Authentication Database Server.

The bots then initiate the download of the compressed/encrypted files to their local storage. Upon successful completion, the protected Cloud folders are zapped/erased using a secure wiping algorithm (e.g. DoD 7-pass, Gutmann 35-pass, etc.) to preserve security, and the bots calculate independent file integrity hashes using the MD5/SHA-1 algorithms. The computed hash values are then compared with the corresponding entries on the Hash Basket Server to ensure that no binary file tampering/corruption occurred during the grace period. Upon successful comparison, the decompression/decryption process starts as an automated sub-routine of the botnet.
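The multi-pass wiping step can be sketched as repeated overwriting before deletion. This is a hedged illustration of the idea only: real DoD-style and Gutmann-style tools use fixed bit patterns and must also defeat filesystem journaling, SSD wear levelling and caching, all of which this sketch ignores.

```python
import os
import tempfile

# Secure-wipe sketch: overwrite the file's bytes several times with random
# data, flush each pass to disk, then unlink the file.
def wipe_file(path: str, passes: int = 3) -> None:
    size = os.path.getsize(path)
    with open(path, "r+b") as handle:
        for _ in range(passes):
            handle.seek(0)
            handle.write(os.urandom(size))   # one overwrite pass
            handle.flush()
            os.fsync(handle.fileno())        # push the pass to physical storage
    os.remove(path)

if __name__ == "__main__":
    fd, path = tempfile.mkstemp()
    with os.fdopen(fd, "wb") as f:
        f.write(b"decrypted archive contents")
    wipe_file(path)
    print(os.path.exists(path))   # False
```

A plain delete only removes the directory entry; the overwrite passes are what reduce the chance of the old archive contents being recovered from the underlying blocks.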

3.3.14. Botnet is in Action
Another automated sub-routine is then triggered automatically upon completion of the previous process. The bots load the decompressed/decrypted files as a sequential buffered flow and initiate the 'reverse engineered assembly' process (converting the binary files into assembly language code) by debug-processing the binaries (this assembly language code cannot be recompiled into the original binaries in any manner).

All of the resulting assembly language code is then saved as separate text files with the .ASM file extension. After that, the bots re-compress/re-crypt all the source file sets into archives exactly as before, using the designated algorithms, and the integrity controls are applied as before to preserve the utmost conceivable security and integrity. The newly generated archives are uploaded back to the protected Cloud folders upon re-authentication with the Authentication Server, and the file hashes are uploaded back to the Hash Basket Server (Integrity Server).

3.3.15. Code Audit
The bot-herder/bot-master then informs the code auditors to start their audit of the code. Each code auditor is given two computers. The first is a PC without a keyboard or mouse (all USB and supplementary ports are disabled) but with a fully touch-sensitive screen/monitor. It projects a read-only view of the bot-converted assembly source code - built with PHP, Python, HTML5, JS, CSS3, AJAX and jQuery, with right-click menus fully disabled and only the browser scroll bars preserved on screen - with online syntax-highlighted code view and software flow view (including conditional and unconditional jumps/junctions in assembly), presented in a special full-screen (kiosk mode) web app (a Metro app as used in Microsoft Windows 8, or an Android/Apple HTML5/CSS3-based app).

This web-based software contacts each protected Cloud folder after successful authentication of the user credentials with the Authentication Server, compares all integrity hashes with the Hash Basket Server (Integrity Server), and loads all assembly source code into the aforesaid host web application over covert, fully encrypted SSL/SSH-based connections (1024-bit keys) or VNC/TeamViewer remote-access tunnels as ultra-secured communication channels.

Code auditors are also given a laptop computer, connected to the corporate LAN and the reporting server, to prepare a complete report. Based on the source code audit view, auditors should carefully analyse the source code and verify that no backdoor access is coded into the software the company is going to purchase. Two independent auditors review the code twice, and those code auditors are located on physically separated floors of the building.

Based on their blended reports, the conclusions of the independent source code reviews are reviewed again by the Chief Code Auditor for the concluding authorization. Only upon the specific and precise authorization of the Chief Code Auditor does the company purchase the software for its internal premises, thereby accomplishing the maximum possible security by evading the illegal backdoor accesses given by major software vendors to resilient governing realms.

Illustration 4.1: Enhanced Conceptual Tactical Model Process

Illustration 4.2: Enhanced Initial Conceptual Process Flow

Illustration 4.3: Conceptual Model Enhanced Process Flow Diagram

4.1. Complete Process Overview in a Nutshell and Philosophical Approach
In accordance with the above projected design, this section describes the entire projected process flow of the revolutionary concept. The final projection of the information gathering chapter was that there is no cure and no solution for the illegal activities conducted by authoritative, dominant international government intelligence agencies such as the CIA, NSA, FBI, etc. Until now, no security mechanism has been designed or presented to prevent such illegal activities from an information security perspective. Another reason may be that most developed countries silently support such illegal activities; contemporary news reflects that the countries supporting these secret surveillance backdoor operations spy on (conduct espionage against) their own nations. Unfortunately, civil society seems not strong enough to fight, defeat or oppose these secret missions initiated by global governing authorities, owing to the process failures of global legal frameworks: authoritative parties challenge or forcefully override the current global legal infrastructure, or hide such secret government activities by changing major or minor legal clauses of the global legal frameworks. Civilian forces in most powerful countries continually protest to win back their privacy rights, and hacker communities such as worldwide illegal cyber movements (e.g. LulzSec, Anonymous, etc.) always try to forcefully stop such illegal authoritative governing activities. But the governing authorities never surrender and never back off in front of such civilian forces or cyber movements. Even though WikiLeaks and the whistle-blower Edward Snowden exposed a huge number of illegal government violations involving the NSA, CIA, FBI, etc., these operations still run authoritatively under the solid-state authority of the global governments.

Accumulating all these facts leads to a solid and precise argument: global civilian protests, illegal cyber force movements, etc. have failed against the authoritative power of the governing bodies of the world. In such a situation, a third world country like Sri Lanka would surely vanish into the valley of the shadow of death. But in the ancient world, Sri Lanka many times ruled the entire Asia-Pacific region (e.g. King Ravana, King Gajaba, King Dutugamunu, etc.), which reflects the power of the Sri Lankan nation. Once again the time has come to show the power of Sri Lankan brains to the whole world. In this manner I introduce this new-fangled concept for the successful prevention of (rather than entrapment within) such authoritative global secret backdooring and spying movements.

This entire project is based on the theory that 'prevention is better than cure'. When a company wants to purchase new software, or the military wants to purchase software for its internal work processes, this is a phenomenal and vital solution.

The software we select to implement in our organization might be commercial-ware or freeware (these types of software do not provide the source code). If it is commercial-ware, there will surely be a trial version (trial-ware) available for customers to run. Initially we download the trial software or freeware from the vendor's/author's website (as a best practice we never download the setup file from unknown third-party sources). After downloading the selected software, we need to check the installation setup file and the related package for known and/or unknown malware infections. To test the particular installation package for malware infections, we analyse the downloaded file inside the Sheep-Dip environment (which comprises heterogeneous anti-malware vendor products). Under the Sheep-Dip environment we initially test the installation package using signature-based malware detection. If no malware is detected, we can proceed to the further steps. If the setup package is identified as malware-infected, we check whether the malware is known or unknown. If it is known malware, we generate a report including the malware name and recommend to management not to purchase this software.

If unknown malware is flagged by the anti-malware engines, we try to identify the category and/or type of the infection using heuristic detection. If it is identified, we generate a report with the malware type, category and behaviour and advise management not to purchase the software for internal use. If heuristic scanning also fails to identify it, we pass the file through a Bloodhound scan. If an infection is found by the Bloodhound scanning method, we prepare the report with the malware type, category and behaviour and advise management not to buy the software. If no malware is found, we can proceed to the further steps without complication.
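The three-stage decision chain above (signature, then heuristic, then Bloodhound) can be sketched as a short escalation pipeline. The three scanner callables are hypothetical stand-ins for real engine integrations; each returns a finding or `None`.

```python
def tiered_scan(sample, signature_scan, heuristic_scan, bloodhound_scan):
    """Run the scanners in escalating order and return an audit verdict.

    Each scanner takes the sample bytes and returns a finding dict
    (e.g. {"name": ..., "category": ...}) or None when nothing is found.
    The first stage that reports an infection decides the verdict.
    """
    stages = (("signature", signature_scan),
              ("heuristic", heuristic_scan),
              ("bloodhound", bloodhound_scan))
    for stage_name, scan in stages:
        finding = scan(sample)
        if finding is not None:
            return {"verdict": "do-not-purchase",
                    "stage": stage_name,
                    "finding": finding}
    return {"verdict": "proceed", "stage": None, "finding": None}
```

A sample that passes all three stages yields the "proceed" verdict that gates the sandbox-installation step described next.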

The setup package is then loaded into a sandbox environment and installed virtually into the operating system. Upon successful installation in the sandbox, we collect all the virtually installed files of the package from the sandbox's virtual vault. Now it is time to check the installed files for known and/or unknown malware infections. To do so, we analyse the installed files inside the sheep-dip environment, again starting with signature-based malware detection. If no malware is detected, we can proceed to the further steps. If the installed files are identified as infected, we determine whether the malware is known or unknown. If it is known malware, we generate a report that includes the malware name and recommend that management not purchase the software. If unknown malware is flagged, we try to identify the category and/or type of the infection using heuristic detection; if it is identified, we generate a report with the malware type, category and behaviour and advise management not to purchase the software for internal use. If heuristic scanning also fails, we pass the files through a Bloodhound scan. If an infection is found by the Bloodhound scanning method, we prepare the report with the malware type, category and behaviour and advise management not to buy the software. If no malware is found, we can proceed to the further steps without complication.
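Gathering everything the sandboxed installer dropped into its virtual vault, so it can be fed back through the same scanners, is a plain directory walk. The vault path layout here is a hypothetical example; real sandboxes expose their captured file sets in product-specific ways.

```python
import os

def collect_installed_files(vault_root):
    """Return the paths of every file under the sandbox's virtual vault,
    sorted for stable processing, ready for the sheep-dip re-scan."""
    collected = []
    for dirpath, _dirnames, filenames in os.walk(vault_root):
        for name in filenames:
            collected.append(os.path.join(dirpath, name))
    return sorted(collected)
```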
After the installed files pass malware detection, it is time to analyse those binaries to check whether they are packed with PE packers or encrypted with PE cryptors. This is an entirely manual process; it is very hard to fully automate because of the huge number of different packers and cryptors in circulation.
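One common (though not definitive) indicator that a binary is packed or encrypted is unusually high byte entropy: compressed or encrypted sections look nearly random. The sketch below computes Shannon entropy over a buffer; the 7.2-bits-per-byte threshold is an illustrative assumption, not a standard value, and real packer identification also checks section names, import tables and known stub signatures.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of the byte distribution, in bits per byte (0.0-8.0)."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def looks_packed(data: bytes, threshold: float = 7.2) -> bool:
    """Heuristic: a near-random byte distribution suggests packing/encryption."""
    return shannon_entropy(data) >= threshold
```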

In this process the specialists first check the binary files for PE cryptors using various PE-crypt detection mechanisms. If one or several files are found to carry a PE-crypted stub, they check the crypted file or files for platform bit compatibility (x86 (32-bit) / x64 (64-bit)). Upon identifying the platform bitness, they decrypt (PE-decrypt) the executable to restore it to its original state. Upon successful decryption of the PE file, they check the file's platform validity, i.e. whether the decrypted file can still be executed on the operating system (this is also known as a PE virginity check or runtime check). If the file runs on the platform, we can proceed to the next step. If the file is invalid, we must rebuild/fix the executable with manual methods and tools (PE-fix tools), a process also known as PE-revirgin. After a successful fix, we re-check the rebuilt file for PE validity. This cycle continues until the file validates and executes successfully on the platform. If the file is not encrypted, we can proceed directly to the next step.

In this procedure the specialists first check the binary files for PE packers using numerous PE-packer detection techniques. If one or several files are found in a PE-packed state, they check the packed file or files for platform bit compatibility (x86 (32-bit) / x64 (64-bit)). Upon identifying the platform bitness, they unpack (PE-unpack) the executable to restore it to its original state. Upon successful unpacking of the PE file, they check the file's platform validity, i.e. whether the unpacked file can still be executed on the operating system. If the file runs on the platform, we can proceed to the next step. If the file is invalid, we must rebuild/fix the executable with manual methods and tools (PE-fix tools), the PE-revirgin process. After a successful fix, we re-check the rebuilt file for PE validity. This cycle continues until the file validates and executes successfully on the platform. If the file is not packed, we can proceed directly to the next step.
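The decrypt-then-validate and unpack-then-validate procedures in the last two sections are the same control loop: restore the file, check it still runs, and if not, fix and re-check until it validates. A hedged skeleton of that cycle, where the `restore`, `validates` and `fix` callables are stand-ins for the manual tools (PE-decryptors/unpackers, runtime checks, PE-fix tools) named above:

```python
def restore_and_validate(blob, restore, validates, fix, max_attempts=10):
    """Restore a packed/crypted PE to its original state, then loop
    fix -> re-validate until the file executes again (the 'PE-revirgin'
    cycle).  Raises if the file never becomes runnable."""
    candidate = restore(blob)          # PE-decrypt or PE-unpack step
    for _ in range(max_attempts):
        if validates(candidate):       # PE virginity / runtime check
            return candidate
        candidate = fix(candidate)     # manual PE-fix / rebuild step
    raise RuntimeError("file could not be rebuilt into a runnable PE")
```

The `max_attempts` bound is an assumption added here so the sketch terminates; the text describes the manual process as open-ended.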

Now it is time to hand the restored binaries to the uploader. As mentioned in the previous sections, each binary has already been checked for platform bit compatibility (x86 (32-bit) / x64 (64-bit)); the specialists in those pools also categorise the resulting binaries according to that classification, saving them into two separate folders (folder 1 = x86, folder 2 = x64). The uploader then uploads the classified files into storage locations at two different cloud vendors, created in advance for this purpose (e.g. Google Drive and Microsoft SkyDrive). Upon successful completion of the upload, the bot-herder (bot master) receives a notification. If an upload fails, the uploader must keep retrying until the files have been successfully uploaded to the cloud storages.
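The x86/x64 classification itself can be automated by reading the Machine field of the PE's COFF header: the 32-bit offset stored at 0x3C in the DOS header points to the "PE\0\0" signature, immediately followed by the two-byte Machine value. A minimal sketch (it only distinguishes the two machine types named in the text):

```python
import struct

MACHINE_X86 = 0x014C   # IMAGE_FILE_MACHINE_I386
MACHINE_X64 = 0x8664   # IMAGE_FILE_MACHINE_AMD64

def pe_bitness(data: bytes):
    """Return 'x86', 'x64', or None for non-PE / unrecognised input,
    so the caller can sort files into the two upload folders."""
    if len(data) < 0x40 or data[:2] != b"MZ":
        return None                               # no DOS header
    (e_lfanew,) = struct.unpack_from("<I", data, 0x3C)
    if len(data) < e_lfanew + 6 or data[e_lfanew:e_lfanew + 4] != b"PE\x00\x00":
        return None                               # no PE signature
    (machine,) = struct.unpack_from("<H", data, e_lfanew + 4)
    return {MACHINE_X86: "x86", MACHINE_X64: "x64"}.get(machine)
```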

Now it is time to arm the virtually hosted botnets. For this process there are two categories of botnet, split by platform bit compatibility: one botnet deals with the 32-bit (x86) binaries and the other with the 64-bit (x64) binaries. The bots are coded to match the platform bitness, meaning the bots in the 32-bit botnet are written in 32-bit C++ and the bots in the 64-bit botnet in 64-bit C++ (in fact, 64-bit botnets have not yet been introduced to the world, or may still be at an experimental stage). The reason for developing 64-bit bots is that 32-bit (x86) decompilers/disassemblers cannot decompile or disassemble 64-bit (x64) binaries, and 64-bit disassemblers and decompilers may not run on a 32-bit (x86) operating system. Therefore we must build the 64-bit botnet from 64-bit bots that carry 64-bit decompilation and disassembly features.

Once the bot-herder receives the uploader's notification of a successful upload, he is also notified of the platform bit classification (i.e. whether the uploaded files are 32-bit, 64-bit or a mixture of the two). On this event, the bot-herder commands the bots in the botnet to initiate their primary task: downloading the uploaded files from the cloud storages to each bot's local storage, in order to execute the secondary protocol, namely disassembling and decompiling the binaries into a disassembly-code view. The command is relayed through a globally located Command and Control server (an IRC server), and all bot-herder-to-bot communication passes, encrypted, over TCP port 6667 or a custom chat port allocated for IRC communication on the Command and Control IRC server.
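A bot-herder order travelling over the C&C channel is, at the wire level, just an IRC text line terminated by CRLF. The sketch below frames such an order using standard IRC message syntax; the channel name and the `FETCH` command verb are hypothetical conventions, not part of the IRC protocol itself, and the encryption layer the text mentions would wrap this framing.

```python
def irc_command(channel: str, verb: str, *args: str) -> bytes:
    """Frame a bot-herder order as an IRC PRIVMSG line (RFC 1459 framing).
    The trailing parameter after ':' may contain spaces; lines end in CRLF."""
    payload = " ".join((verb,) + args)
    line = f"PRIVMSG {channel} :{payload}\r\n"
    return line.encode("utf-8")
```

For example, `irc_command("#cc", "FETCH", "x64", "batch-7")` yields `b"PRIVMSG #cc :FETCH x64 batch-7\r\n"`, which listening bots would parse to start the download stage for the 64-bit batch.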

On the bot-herder's command, the bots in those botnets download all the corresponding uploaded binaries from the cloud storages to their local storages. The bots are installed at various geographical locations around the world (e.g. test PCs, test labs, home PCs), so the local storage is the installed PC at each location. Once the binaries are downloaded to a bot's local storage, the bot automatically triggers the next task of the main operation: decompiling/disassembling the downloaded binaries. Since this operation is entirely automated, all the uploaded binaries are disassembled/decompiled within a very short period of time. After decompilation and disassembly, the resulting text-based disassembly code files are instantly and automatically uploaded to cloud storage at a different vendor, again organised by platform bit classification. Once the disassembly code files have been uploaded, the locally remaining copies of the disassembly code files and the downloaded binary source files are deleted using secure-wipe mechanisms (e.g. the DoD 5220.22-M seven-pass and Gutmann 35-pass algorithms). The original cloud storage (holding the binaries initially uploaded by the uploader) is also wiped instantly. All the generated and uploaded disassembly code files carry the original binaries' PE architectural information, such as name and size.
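The secure-wipe step can be sketched as a multi-pass overwrite before deletion. This is a simplified stand-in for the DoD 5220.22-M and Gutmann schemes named above (those prescribe specific pass patterns and pass counts, seven and thirty-five respectively); the random-bytes pass here is illustrative, and caveats such as SSD wear-levelling and filesystem journalling are glossed over.

```python
import os

def secure_wipe(path, passes=3):
    """Overwrite a file's contents `passes` times with random bytes,
    syncing each pass to disk, then unlink it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as fh:
        for _ in range(passes):
            fh.seek(0)
            fh.write(os.urandom(size))
            fh.flush()
            os.fsync(fh.fileno())   # force the pass onto the disk
    os.remove(path)
```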

Now it is time to describe the final stage of the project, again an entirely manual process, known as code analysis and code auditing. The code analysers'/code auditors' pool sits inside the organisation. The auditing/analysis team remotely reads the disassembly code of the entire package the organisation intends to purchase for its internal workflow. The team creates a secure (highly encrypted) VNC (Virtual Network Computing) covert channel to the appropriate cloud storage to view the disassembly code files. The team can only view the code; they cannot download or modify (tamper with) any of the disassembly code files in the cloud storages. They then analyse and audit the entire set of disassembly code files for backdoors, rootkits, spyware or any other form of malware that might appear in the code view. Based on the complete analysis, the team generates a final report for management's decision making, giving a full technical overview of the package together with a management review and recommendations. From the final report, management can easily make its decision (e.g. if the code-audit team finds malware anywhere in the package's code, management will surely decide not to purchase the software for its internal workflow; if no form of malware is found, they can purchase the software without any doubt).
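The decision rule the report feeds to management is simple: any confirmed finding blocks the purchase. A sketch of the report structure (the field names are illustrative, not a prescribed schema):

```python
def final_report(package_name, findings):
    """Summarise the code-audit findings into a management recommendation:
    any confirmed malware/backdoor finding blocks the purchase."""
    return {
        "package": package_name,
        "findings": list(findings),
        "recommendation": "do not purchase" if findings else "safe to purchase",
    }
```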
5.1. Introduction
Software testing is a method of assessing the functionality of a software program. There are many different types of software testing but the two main categories are dynamic testing and static testing. Dynamic testing is an assessment that is conducted while the program is executed; static testing, on the other hand, is an examination of the program's code and associated documentation. Dynamic and static methods are often used together.

Illustration 5.1: Software Testing Life Cycle

Illustration 5.2: Software Testing Types

Illustration 5.3: Defect Tracking

Illustration 5.4: Bug Hunting

Illustration 5.5: Agile Testing
5.2. Black-Box Testing
A black box is any device whose workings are not understood by or accessible to its user. According to Edward Tenner, writing in The Washington Post, the first black box was a gun sight carried on World War II Flying Fortresses, with hidden components that corrected for environmental variables, such as wind speed. The crew probably didn't know how the device worked, but they knew it might be crucial to their survival. Nowadays, there are two types of black box carried on aircraft, which may be combined into a single device: a Flight Data Recorder (FDR), which logs information such as speed and altitude, and a Cockpit Voice Recorder (CVR), which logs all voice communication in the cockpit. These black boxes also carry beacons to help find the aircraft in a rescue situation.

In telecommunications, a black box is a resistor connected to a phone line that makes it impossible for the telephone company's equipment to detect when a call has been answered. In data mining, a black box is an algorithm or a technology that doesn't provide an explanation of how it works. In software development, a black box is a testing method in which the tester has no knowledge of the inner workings of the program being tested. The tester might know what input is and what the expected outcome is, but not how the results are achieved. A black box component is a compiled program that is protected from alteration by ensuring that a programmer can only access it through an exposed interface. In film-making, a black box is a dedicated hardware device: equipment that is specifically used for a particular function. In the theatre and television, a black box is an unfurnished studio. In the financial world, a black box is a computerized trading system that does not make its rules easily available. Perhaps because the metaphor is broadly applicable, black box is sometimes used to refer to anything that works without its inner workings being understood or accessible for understanding.

Illustration 5.6: Black-Box Testing

Illustration 5.7: Black-Box Regression Testing

5.3. Fuzz Testing
Fuzz testing or fuzzing is a software testing technique used to discover coding errors and security loopholes in software, operating systems or networks by inputting massive amounts of random data, called fuzz, into the system in an attempt to make it crash. If a vulnerability is found, a tool called a fuzz tester (or fuzzer) indicates potential causes. Fuzz testing was originally developed by Barton Miller at the University of Wisconsin in 1989. Fuzzers work best for problems that can cause a program to crash, such as buffer overflows, cross-site scripting, denial-of-service conditions, format bugs and SQL injection. These schemes are often used by malicious hackers intent on wreaking the greatest possible amount of havoc in the least possible time. Fuzz testing is less effective for dealing with security threats that do not cause program crashes, such as spyware, some viruses, worms, Trojans and keyloggers.
Fuzz testing is simple and offers a high benefit-to-cost ratio. Fuzz testing can often reveal defects that are overlooked when software is written and debugged. Nevertheless, fuzz testing usually finds only the most serious faults. Fuzz testing alone cannot provide a complete picture of the overall security, quality or effectiveness of a program in a particular situation or application. Fuzzers are most effective when used in conjunction with extensive black box testing, beta testing and other proven debugging methods.
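A minimal random fuzzer in the spirit described above: throw random byte strings at a target function and record every input that makes it raise. The `fragile_parse` target is a contrived example invented here so the harness has something to crash; real fuzzers are far more sophisticated about input generation and crash triage.

```python
import random

def fuzz(target, runs=200, max_len=64, seed=1234):
    """Feed `runs` random byte strings to `target`; return the inputs
    that raised an exception (the 'crashes')."""
    rng = random.Random(seed)          # fixed seed for reproducible runs
    crashes = []
    for _ in range(runs):
        data = bytes(rng.randrange(256) for _ in range(rng.randrange(max_len)))
        try:
            target(data)
        except Exception:
            crashes.append(data)
    return crashes

def fragile_parse(data: bytes):
    """Contrived target: blows up whenever the input contains a 0xFF byte."""
    if b"\xff" in data:
        raise ValueError("malformed input")
    return len(data)
```

Each recorded crash input can then be replayed against the target to reproduce and debug the fault, which is where the benefit-to-cost ratio mentioned above comes from.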

Illustration 5.8: Fuzzing Life Cycle

Illustration 5.9: Black-Box Fuzzing

Illustration 5.10: Fuzzing the System

5.4. Software Quality Assurance and Quality Control with Selenium
SQA helps ensure the development of high-quality software. SQA practices are implemented in most types of software development, regardless of the underlying software development model being used. In a broader sense, SQA incorporates and implements software testing methodologies to test software. Rather than checking for quality after completion, SQA tests for quality in each phase of development until the software is complete. With SQA, the software development process moves into the next phase only once the current phase complies with the required quality standards. SQA generally works to one or more industry standards that help in building software quality guidelines and implementation strategies. These standards include ISO 9000 and Capability Maturity Model Integration (CMMI).

Illustration 5.11: Quality Assurance Life Cycle
Selenium is a set of different software tools each with a different approach to supporting test automation. Most Selenium QA Engineers focus on the one or two tools that most meet the needs of their project, however learning all the tools will give you many different options for approaching different test automation problems. The entire suite of tools results in a rich set of testing functions specifically geared to the needs of testing of web applications of all types. These operations are highly flexible, allowing many options for locating UI elements and comparing expected test results against actual application behaviour. One of Selenium's key features is the support for executing one's tests on multiple browser platforms.

Illustration 5.12: Quality Assurance Life Cycle - Advanced

The critical aim of this project is to help people in general, and I focused essentially on corporate information-security departments, military information-security forces and police information-security departments, because a great deal of secret data is shared in computer-based and Internet-based environments. That was one of my prime concerns in initiating this project. I tried to solve every legal problem within the applicable legal framework as effectively as possible, and I succeeded in staying within it: I studied the EULAs of most software applications and the DMCA (Digital Millennium Copyright Act) in order to understand global and domestic intellectual-property law, because decompiling binaries to inspect their code for vulnerabilities and backdoors raises legal questions. The next problem was how to do this effectively; it occurred to me that it could be done using a botnet, since no single observer then knows how or where the binaries are decompiled, and the recovered source code is never exposed through any automatic channel. This is only a starting point, and the system can be further improved using Artificial Intelligence (AI); that is a future prospect.

I am not worried about stating that this is the first method of its kind; I have copyrighted the software and concept under law and hold the privileges to manage every aspect of it myself. This is a one-of-a-kind system and, I am proud to say, a new Sri Lankan innovation. The framework can identify attempts to track us, tap us, backdoor us, spy on us and so on; that is its value from my point of view.

I can warrant that this software will never backdoor you or capture your data in any such unethical way.


• This project has the capability and the capacity to evolve with future prospects and with the need to prevent people being cheated out of their information; to fulfil those needs, further improvements to the system are designated below.

• The reporting tools can be improved to generate more sensitive and detailed report data.

• AI (Artificial Intelligence) technology can be joined to the bots in this project, which would greatly reduce the time spent generating the decompiled code.

• The framework can be introduced to military forces together with a simple usage survey; the results can then guide improvements in the most promising areas.

• 64-bit bots can be developed for decompiling 64-bit software.


Throughout the course of this project many useful and interesting goals were achieved; in particular, all of the initial project goals were met. Among the other major achievements, creating the software binary files, decrypting them, parsing values, connecting the botnet to its bots and finally delivering the output via the VNC viewer were challenging tasks, and those challenges were met by implementing the system successfully.

The project was carried out in partial fulfilment of the requirements of a degree at the School of Computing, Teesside University, Middlesbrough.
This appendix comprises bibliographic citations on the specific subject matter of cyber defence, cyber attack and the associated literature. This set of literature was used for numerous analyses within the context of this thesis. Some of the bibliographic citations listed were used for the cyber-attack and associated terminology analysis, and the rest were used for the cyber-defence and associated terminology analysis.

Please note that most of the cited sensitive/vital documents and resources were extracted from published online dumps and leaks of email and Web archives. These dumps and leaks were published on 'The Pirate Bay' and in supplementary Web and email archives as part of 'Operation Payback', 'Operation Tango Down' and 'Operation Anti-Sec' by the 'LulzSec' and 'Anonymous' hacktivists. No trusted information sources have been officially published on the Internet other than such leaks, via 'WikiLeaks', 'Cryptome', the 'PRISM' leaks, underground websites, underground (hacker) Internet Relay Chat (IRC) channels, online hacking forums and discussion boards, online paste-bins, social networks, and, mostly, P2P networks such as 'The Pirate Bay' torrents.

The references specified in this dissertation might be considered illegal or private (confidential) from a particular individual's point of view, and some may heavily criticise acquiring confidential information from private sources. Then again, every source of information that such individuals might think I acquired from secretive private channels has already been published in the public domain of the WWW (e.g. Cryptome, WikiLeaks, the PRISM leaks). My rock-solid position, which I argue back with equal gravity against such senseless criticism, is that I have not committed any kind of illegal information gathering (reconnaissance) by any form of illegal penetration of any of the aforesaid private sources. (For example, ManTech and HBGary, US federal defence contractors, were hacked by Anonymous hacktivists, who published their explicitly sensitive and confidential data on The Pirate Bay as dumps and leaks.)


1. Alex Huth and James Cebula, (2011), The Basics of Cloud Computing, US-CERT.

2. Peter Mell and Timothy Grance, (2011), The NIST Definition of Cloud Computing, NIST Special Publication 800-145.

3. Markus F.X.J. Oberhumer, László Molnár and John F. Reiser, (2013), UPX (the Ultimate Packer for eXecutables), Available online, last accessed 1st April 2014.

4. Peter Ferrie, (2010), Anti-Unpacker Tricks, Microsoft Corporation.

5. Eric Geier, (2012), How to Keep Your PC Safe With Sandboxing, Available online, last accessed 31st March 2014.

6. Mafaz Mohsim Khalil Al-Anezi, (2014), IJACSA (International Journal of Advanced Computer Science and Applications), Available online, last accessed 31st March 2014.

7. Eldad Eilam, (2005), 'Foundations', in Reversing: Secrets of Reverse Engineering, Indianapolis: Wiley Publishing, Inc., ISBN-10: 0-7645-7481-7, ISBN-13: 978-0-7645-7481-8, pp. 3-9.

8. Cory Janssen, (2014), Reverse Engineering, Available online, last accessed 30th March 2014.

9. Greg Hoglund and Gary McGraw, (2004), Reverse Engineering and Program Understanding, Available online, last accessed 1st April 2014.

10. Rik Ferguson, (2010), The Botnet Chronicles, Trend Micro Inc., p. 4.

11. Margaret Rouse, (2012), Botnet (Zombie Army), Available online, last accessed 31st March 2014.

12. Luke Harding, (2014), Edward Snowden: US Government Spied on Human Rights Workers, Available online, last accessed 8th April 2014.

13. Glenn Greenwald and Ewen MacAskill, (2013), NSA PRISM Program Taps in to User Data of Apple, Google and Others, Available online, last accessed 31st March 2014.

14. The Guardian, (2013), Project BULLRUN - Classification Guide to the NSA's Decryption Program, Available online, last accessed 25th March 2014.

15. Mohit Kumar, (2013), NSA Can Eavesdrop on Traffic in Real Time; More PRISM Slides Leaked, Available online, last accessed 25th March 2014.

16. Edward Snowden, (2013), NSA Files: Verizon, PRISM, Boundless Informant, FISA 702, XKEYSCORE (+ Whistle-blower Edward Snowden Interview), Available online, last accessed 25th March 2014.

17. Chris Civil, Andrew Crocker and Mark M. Jaycox, (2013), Introducing a Compendium of the Released NSA Spying Documents, Available online, last accessed 25th March 2014.

18. Cryptome (leak), (2014), Available online, last accessed 1st April 2014.

19. Fort George G. Meade, (2013), National Security Agency, Available online, last accessed April 2013.

20. Julian Assange, (2014), OR Books Announces a Major New Book with Julian Assange, Available online, last accessed 2nd April 2014.

21. Margaret Rouse, (2008), Black Box (Black Box Testing), Available online, last accessed 31st March 2014.

22. Margaret Rouse, (2010), Fuzz Testing (Fuzzing), Available online, last accessed 31st March 2014.

23. Cory Janssen, (2014), Software Quality Assurance (SQA), Available online, last accessed 31st March 2014.
