SecConcept - Secure Coding Review and Analysis
Secure Coding Review and Analysis
- Applications require a last look to ensure that the application and its components are free of security flaws.
secure code review
- serves to detect all the inconsistencies that weren’t found in other types of security testing
- to ensure the application’s logic and business code is sound.
Reviews can be done via both manual and automated methods
- focus on finding flaws in areas such as:
- Authentication, authorization, security configuration,
- session management, logging,
- data validation, error handling, and encryption.
- Code reviewers should be well-versed in the language of the application they’re testing
- knowledgeable on the secure coding practices and security controls that they need to be looking out for.
- need to understand the full context of the application, including its intended audience and use cases
- Without that context, code may look secure at first glance but still be easily attacked.
- Knowing the context in which an app is going to be used and how it will function is the only way to certify that the code adequately protects whatever you’ve entrusted to it.
benefits
- cuts down on the time and resources it would take to fix vulnerabilities detected after release
- The security bugs looked for during a secure code review have caused countless breaches, resulting in billions of dollars in lost revenue, fines, and lost customers.
- Static Code Analysis
- Advantages of Static Code Analysis:
- able to find weaknesses or vulnerabilities in the code at the exact location.
- It can scan the entire code base.
- extremely fast if automated tools are used.
- Vulnerabilities are discovered in the early stages of development. Thus, it reduces the cost to fix hidden flaws in the future.
- provides mitigation recommendations to security researchers so they can be aware of possible issues in future development.
- Disadvantages of Static Code Analysis:
- takes a long time if conducted manually
- rarely finds flaws or vulnerabilities in the runtime environment.
- Occasionally, it produces many false positives and false negatives.
- Not all automated tools support multiple programming languages.
- Dynamic Code Analysis
- Advantages of Dynamic Code Analysis:
- Unlike static code analysis, it can identify vulnerabilities in a runtime environment.
- can be run against any live application.
- capable of identifying false negatives in static code analysis.
- allows you to examine whether the results of static code analysis are valid.
- Disadvantages of Dynamic Code Analysis:
- extremely tedious to trace vulnerabilities back to a specific area or location in the code.
- Automated tools in dynamic code analysis provide false positives and false negatives.
- Automated tools in dynamic code analysis may provide a false sense of security.
5 Tips to a Better Secure Code Review
- Produce code review checklists to ensure consistency between reviews by different developers
- All reviewers should work from the same comprehensive checklist; without a well-designed checklist, reviewers can forget to perform certain checks.
- Enforce time constraints as well as mandatory breaks for manual code reviewers, especially when looking at high-value applications.
- Ensure a positive security culture by not singling out developers
- It can be easy, especially with reporting by some tools being able to compare results over time, to point the finger at developers who routinely make the same mistakes. It’s important when building a security culture to refrain from playing the blame game with developers; this only serves to deepen the gap between security and development. Use your findings to help guide your security education and awareness program, using those common mistakes as a jumping off point and relevant examples developers should be looking out for.
- Again, developers aren’t going to improve in security if they feel someone’s watching over their shoulder, ready to jump at every mistake made. Facilitate their security awareness in more positive ways and your relationship with the development team, but more importantly the organization in general, will reap the benefits.
- Review code each time a meaningful change in the code has been introduced
- If you have a secure SDLC in place, you understand the value of testing code on a regular basis. Secure code reviews don’t have to wait until just before release. For major applications, we suggest performing manual code reviews when new changes are introduced, saving time and human brainpower by having the app reviewed in chunks.
- A mix of human review and tool use is best to detect all flaws
- Tools aren’t (yet) armed with the mind of a human, and therefore can’t detect issues in the logic of code and are hard-pressed to correctly estimate the risk to the organization if such a flaw is left unfixed in a piece of code. Thus, as we discussed above, a mix of static analysis testing and manual review is the best combination to avoid missing blind spots in the code. Use your teams’ expertise to review more complicated code and valuable areas of the application and rely on automated tools to cover the rest.
- Continuously monitor and track patterns of insecure code
- Tracking the repetitive issues you see across reports and applications helps inform future reviews, letting you refine your secure code review checklist as well as your AppSec awareness training. Monitoring your code offers great insight into the patterns that could be the cause of certain flaws, and will help you when you’re updating your review guide.
Top 10 Secure Coding Practices
- Validate input.
- Validate input from all untrusted data sources.
- Proper input validation can eliminate the vast majority of software vulnerabilities.
- Be suspicious of most external data sources, including command line arguments, network interfaces, environmental variables, and user controlled files [Seacord 05].
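As a rough illustration of allow-list input validation (a sketch only; the field names and rules below are made up, not from the source):

```python
import os
import re
import sys

# Hypothetical validators -- the field names and rules are illustrative only.
USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{3,32}$")  # allow-list: letters, digits, underscore

def validate_username(raw: str) -> str:
    """Accept only usernames matching the allow-list pattern; reject everything else."""
    if not USERNAME_RE.fullmatch(raw):
        raise ValueError("invalid username")
    return raw

def validate_port(raw: str) -> int:
    """Accept only integers in the valid TCP port range."""
    port = int(raw)  # raises ValueError on non-numeric input
    if not 1 <= port <= 65535:
        raise ValueError("port out of range")
    return port

if __name__ == "__main__":
    # Untrusted sources (command-line arguments, environment variables) pass
    # through a validator before the rest of the program ever sees them.
    user = validate_username(sys.argv[1] if len(sys.argv) > 1 else "")
    port = validate_port(os.environ.get("APP_PORT", "8080"))
    print(f"validated input: user={user}, port={port}")
```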
- Heed compiler warnings.
- Compile code using the highest warning level available for your compiler and eliminate warnings by modifying the code [C MSC00-A, C++ MSC00-A].
- Use static and dynamic analysis tools to detect and eliminate additional security flaws.
- Architect and design for security policies.
- Create a software architecture and design your software to implement and enforce security policies.
- For example, if your system requires different privileges at different times, consider dividing the system into distinct intercommunicating subsystems, each with an appropriate privilege set.
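A minimal sketch of that idea (the report-store scenario and names are invented for illustration): the privileged part of the system exposes one narrow operation, and the unprivileged part can only go through it.

```python
from pathlib import Path

# Hypothetical directory the privileged subsystem is allowed to touch.
ALLOWED_DIR = Path("/var/app/reports")

class PrivilegedReportStore:
    """Would run with elevated rights; exposes a single narrowly scoped operation."""

    def read_report(self, name: str) -> bytes:
        target = (ALLOWED_DIR / name).resolve()
        # Refuse anything that escapes the allowed directory (e.g. "../../etc/passwd").
        if ALLOWED_DIR.resolve() not in target.parents:
            raise PermissionError("path escapes the allowed directory")
        return target.read_bytes()

class RequestHandler:
    """Unprivileged subsystem: parses requests and can only delegate to the narrow interface."""

    def __init__(self, store: PrivilegedReportStore):
        self._store = store

    def handle(self, report_name: str) -> bytes:
        return self._store.read_report(report_name)
```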
- Keep it simple.
- Keep the design as simple and small as possible [Saltzer 74, Saltzer 75].
- Complex designs increase the likelihood that errors will be made in their implementation, configuration, and use.
- Additionally, the effort required to achieve an appropriate level of assurance increases dramatically as security mechanisms become more complex.
- Default deny.
- Base access decisions on permission rather than exclusion.
- by default, access is denied and the protection scheme identifies conditions under which access is permitted [Saltzer 74, Saltzer 75].
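A minimal default-deny sketch (the role and action names are hypothetical): anything not explicitly permitted is refused.

```python
# Explicit allow-list of role -> permitted actions; everything else is denied.
ALLOWED_ACTIONS = {
    "viewer": {"read_report"},
    "editor": {"read_report", "update_report"},
}

def is_allowed(role: str, action: str) -> bool:
    """Deny by default: access is granted only if the (role, action) pair is explicitly listed."""
    return action in ALLOWED_ACTIONS.get(role, set())

assert is_allowed("viewer", "read_report")
assert not is_allowed("viewer", "delete_report")      # not listed -> denied
assert not is_allowed("unknown_role", "read_report")  # unknown role -> denied by default
```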
- Adhere to the principle of least privilege.
- Every process should execute with the least set of privileges necessary to complete the job.
- Any elevated permission should be held for only the minimum amount of time required to complete the privileged task.
- This approach reduces the opportunities an attacker has to execute arbitrary code with elevated privileges [Saltzer 74, Saltzer 75].
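One way to sketch this on a POSIX system (an illustration only; it assumes a process started as root that normally runs under a hypothetical unprivileged UID):

```python
import os
from contextlib import contextmanager

# Hypothetical UIDs: the process starts as root, drops to UNPRIVILEGED_UID at
# startup, and re-elevates only around a single privileged operation.
UNPRIVILEGED_UID = 1000
PRIVILEGED_UID = 0

@contextmanager
def elevated():
    os.seteuid(PRIVILEGED_UID)        # raise privileges for the task...
    try:
        yield
    finally:
        os.seteuid(UNPRIVILEGED_UID)  # ...and give them back immediately afterwards

def bind_privileged_port(sock):
    # Elevated only for the duration of this one call; on Linux, binding a
    # port below 1024 normally requires privileges.
    with elevated():
        sock.bind(("0.0.0.0", 80))
```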
- Sanitize data sent to other systems.
- Sanitize all data passed to complex subsystems [C STR02-A] such as command shells, relational databases, and commercial off-the-shelf (COTS) components
- Attackers may be able to invoke unused functionality in these components through the use of SQL, command, or other injection attacks.
- This is not necessarily an input validation problem because the complex subsystem being invoked does not understand the context in which the call is made.
- Because the calling process understands the context, it is responsible for sanitizing the data before invoking the subsystem.
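For the relational-database case, one common way to do this is to pass untrusted values as bound parameters rather than pasting them into the query text (a sketch using Python’s built-in sqlite3 module; the table and data are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

untrusted = "alice' OR '1'='1"  # attacker-controlled value

# Unsafe: pasting untrusted data straight into the SQL string is injectable.
#   conn.execute(f"SELECT email FROM users WHERE name = '{untrusted}'")

# Safer: the untrusted value is passed as a bound parameter, never as SQL text.
rows = conn.execute("SELECT email FROM users WHERE name = ?", (untrusted,)).fetchall()
print(rows)  # [] -- the injection payload is treated as a literal name, not as SQL
```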
- Practice defense in depth.
- Manage risk with multiple defensive strategies, so that if one layer of defense turns out to be inadequate, another layer of defense can prevent a security flaw from becoming an exploitable vulnerability and/or limit the consequences of a successful exploit.
- For example, combining secure programming techniques with secure runtime environments should reduce the likelihood that vulnerabilities remaining in the code at deployment time can be exploited in the operational environment [Seacord 05].
- Use effective quality assurance techniques.
- Good quality assurance techniques can be effective in identifying and eliminating vulnerabilities. Fuzz testing, penetration testing, and source code audits should all be incorporated as part of an effective quality assurance program. Independent security reviews can lead to more secure systems. External reviewers bring an independent perspective; for example, in identifying and correcting invalid assumptions [Seacord 05].
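As a toy illustration of the fuzz-testing idea (the parser and its input format are hypothetical):

```python
import random
import string

def parse_record(line: str) -> dict:
    """Hypothetical parser under test: expects 'key=value;key=value' input."""
    return dict(pair.split("=", 1) for pair in line.split(";") if pair)

def fuzz(iterations: int = 10_000) -> None:
    """Throw random printable strings at the parser and report unexpected crashes."""
    for _ in range(iterations):
        payload = "".join(random.choice(string.printable)
                          for _ in range(random.randint(0, 80)))
        try:
            parse_record(payload)
        except ValueError:
            pass  # expected rejection of malformed input
        except Exception as exc:  # anything else is a potential robustness bug
            print(f"unexpected failure on {payload!r}: {exc!r}")

if __name__ == "__main__":
    fuzz()
```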
- Adopt a secure coding standard.
- Develop and/or apply a secure coding standard for your target development language and platform.
Common Techniques
There is no formal documentation for code auditing. However, there are two common techniques preferred by security researchers and bug hunters.
- Static Analysis
- Dynamic Analysis
Static Analysis
Static code analysis commonly refers to running static code analysis tools that attempt to highlight possible vulnerabilities within ‘static’ source code by using techniques such as taint analysis and data flow analysis. Below are common techniques of static code analysis.
Data Flow Analysis:
- collects dynamic information about data in software while it is in a static state.
The common terms used in data flow analysis are:
- Control Flow Graph:
- It is an abstract representation of software by use of nodes.
- The nodes in a control flow graph represent basic blocks.
- Directed edges in the graph represent paths or routes from one block to another.
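A minimal sketch of what such a graph looks like in code (the basic blocks B1..B4 are invented for a small if/else example):

```python
# Control flow graph for a function shaped like:
#   B1: if x > 0    -> B2 or B3
#   B2: y = x       -> B4
#   B3: y = -x      -> B4
#   B4: return y
cfg = {
    "B1": ["B2", "B3"],  # conditional branch: two outgoing edges
    "B2": ["B4"],
    "B3": ["B4"],
    "B4": [],            # exit block
}

def reachable(graph: dict, start: str) -> set:
    """Follow directed edges to find every block reachable from `start`."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph[node])
    return seen

print(reachable(cfg, "B1"))  # all four blocks are reachable from the entry block
```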
- Lexical Analysis:
- converts the syntax of the source code into tokens of information.
- It is the first phase of a compiler, also known as the scanner; it converts the high-level input program into a sequence of tokens.
- This abstracts the source code and makes it less difficult to manipulate.
- The output is a sequence of tokens that is sent to the parser for syntax analysis.
Example of tokens:
- Type tokens (id, number, real, ...)
- Punctuation tokens (IF, void, return, ...)
- Alphabetic tokens (keywords)
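As a quick illustration, Python’s standard tokenize module can show the token stream a lexer produces (a sketch; the source snippet is arbitrary):

```python
import io
import tokenize

src = "if total > 100:\n    discount = total * 0.1\n"

# The lexer flattens the source text into (token type, token text) pairs,
# which is the sequence a parser would consume next.
for tok in tokenize.generate_tokens(io.StringIO(src).readline):
    print(f"{tokenize.tok_name[tok.type]:<10} {tok.string!r}")
```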
- Taint Analysis:
- identifies variables tainted with user-controllable input and traces them to possible vulnerable functions (sinks).
- identifies all sources of user data, such as inputs, headers, and so on, and follows that data through the system to make sure it gets sanitized before anything is done with it.
- For instance, Perl and Ruby have built-in taint-checking mechanisms for input or data accepted via CGI.
- https://shell-storm.org/blog/Taint-analysis-and-pattern-matching-with-Pin/
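A toy sketch of the idea (not a real analyzer; real tools propagate the taint label automatically along the data flow, whereas here it is carried by hand with a marker class):

```python
class Tainted(str):
    """Marker for attacker-controllable data."""

def read_request_param() -> Tainted:
    # Hypothetical source of user data (e.g. a query-string parameter).
    return Tainted("alice' OR '1'='1")

def sanitize(value: str) -> str:
    """Keep only characters on a conservative allow-list; the result is untainted."""
    return "".join(c for c in value if c.isalnum() or c in " _-")

def run_query(sql) -> None:
    # Sensitive sink: refuses data that is still marked as tainted.
    if isinstance(sql, Tainted):
        raise RuntimeError("tainted data reached the SQL sink unsanitized")
    print("executing:", sql)

user = read_request_param()
run_query("SELECT * FROM users WHERE name = '" + sanitize(user) + "'")  # sanitized -> allowed

try:
    # The taint marker is re-applied by hand here to show the flagged path.
    run_query(Tainted("SELECT * FROM users WHERE name = '" + user + "'"))
except RuntimeError as err:
    print("blocked:", err)
```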
Dynamic Analysis
- performed by executing programs on a real or virtual processor. For dynamic program analysis to be effective, the target program must be executed with sufficient test inputs to produce interesting behavior.
- Use of software testing measures such as code coverage helps ensure that an adequate slice of the program’s set of possible behaviors has been observed.
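A small sketch of the coverage idea using Python’s standard trace module (the function and test inputs are made up; note that nothing here exercises the zero branch):

```python
import trace

def classify(n: int) -> str:
    if n < 0:
        return "negative"
    if n == 0:
        return "zero"
    return "positive"

# Run the code under a tracer that counts which lines actually execute for
# the chosen test inputs -- these inputs never reach the n == 0 branch.
tracer = trace.Trace(count=True, trace=False)
tracer.runfunc(lambda: [classify(n) for n in (-5, 3, 42)])
tracer.results().write_results(show_missing=True, coverdir=".")  # writes a .cover report
```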
SDLC
Software Development Life Cycle.
a comprehensive framework that describes the various stages involved in the development of software, from initial planning and requirements gathering to deployment and maintenance.
The SDLC model follows a systematic and iterative approach to ensure that the software is developed efficiently, meets high-quality standards, and aligns with the needs of stakeholders.
The Typical Phases of the Software Development Life Cycle (SDLC):
- Planning and Requirements Gathering:
- the initial phase where the objectives of the software project are defined, potential stakeholders are identified, and their requirements are gathered.
- A detailed project plan is created, outlining the scope, timeline, and budget.
- Defining Requirements:
- the gathered requirements are analyzed, refined, and prioritized.
- Functional and non-functional requirements are specified, providing a clear understanding of what the software needs to achieve.
- Design:
- The software architecture and design are developed during this phase.
- The system’s structure is visualized, and detailed diagrams are created to represent the different components and how they will interact.
- Development:
- The actual coding of the software is performed in this phase.
- The software is developed based on the detailed design specifications and requirements.
- Testing:
- Thorough testing is conducted in this phase to identify and fix bugs, errors, and performance issues in the software.
- Various testing methodologies, such as unit testing, integration testing, and system testing, are applied.
- Deployment:
- Once the software is thoroughly tested and meets the required quality standards, it is deployed to the target environment.
- This may involve installing the software on servers, configuring the system, and training end-users.
- Maintenance:
- This is the ongoing phase where the software is monitored, maintained, and updated to meet changing user needs and address any issues that arise after deployment. Bug fixes, enhancements, and performance improvements are continuously performed.
The SDLC model can be iterative, allowing for continuous feedback and improvement throughout the development process. Different organizations may adopt specific SDLC methodologies, such as Agile, Waterfall, Scrum, Kanban, or Extreme Programming, based on the nature of the project and their organizational preferences.
DEV
Dev (Development):
The ‘Dev’ environment is the initial stage where software development and testing begin.
It is a controlled and isolated environment where developers can work freely, make code changes, experiment with new features, and run tests without affecting the main system.
Characteristics of the Dev environment:
It is a copy of the production system but may not reflect the exact data or configurations.
It is accessible only to developers and a select few testers.
It is used for continuous integration (CI) and continuous deployment (CD) pipelines, where code is automatically built, tested, and deployed to this environment.
UAT
User Acceptance Testing
a critical phase in the software development life cycle (SDLC) where the end-users or client representatives evaluate the software application or system to determine whether it meets their requirements and expectations.
The ‘UAT’ environment is a staging area that bridges the gap between the development and production environments.
It is designed to simulate the real-world production environment as closely as possible.
Once the software is tested and deemed stable in the Dev environment, it is migrated to the UAT environment for final testing by end-users, business stakeholders, and QA testers.
The purpose of UAT is to validate the software’s functionality, usability, and performance in a real-world setting before it is deployed to production.
Characteristics of the UAT environment:
It is a copy of the production environment with realistic data and configurations.
It is accessible to end-users, business stakeholders, and QA testers for testing purposes.
Any issues identified in UAT are addressed before the software is deployed to Prod.
Key Purpose of UAT:
- Enhance User Experience: UAT allows users to interact with the software in a realistic environment, providing feedback on usability, functionality, and user interface issues.
- Confirm Compliance: Users verify that the system aligns with their business processes, regulatory requirements, and data security standards.
- Identify and Fix Errors: UAT helps uncover bugs, performance problems, or compatibility issues that may have been overlooked during previous testing stages.
- End-user Validation: Ultimately, UAT ensures that the software fits the purpose for which it was developed and will be adopted successfully by the end-users.
Types of User Acceptance Testing:
Alpha Testing: Performed by internal stakeholders, such as developers, quality assurance testers, or a special testing team, to simulate real-world usage and identify critical issues.
Beta Testing: Involves real end-users in an uncontrolled environment to gather feedback, identify usability problems, and measure performance.
Migration Testing: Used when a system is migrated from an old environment to a new one, ensuring data integrity and smooth transition.
Regression Testing: Verifies that existing features and functionalities are still working correctly after new changes or updates are made.
Integration Testing: Ensures seamless interaction between different software modules or components.
Installation Testing: Compares the actual installation process with the documented requirements, verifies data integrity, and checks for any errors.
UAT is a vital step in the software development process, bridging the gap between development and end-user deployment. It helps ensure that the delivered software meets the user’s needs, enhances user satisfaction, and increases the likelihood of a successful software implementation.
PROD
Prod (Production):
The ‘Prod’ environment is the live operational environment where the final, tested, and approved software is running and serving end-users.
It is the actual system that businesses rely on for their day-to-day operations.
Deploying software to the Prod environment should be done with utmost caution and rigorous testing, as any issues can impact real users and business processes.
Characteristics of the Prod environment:
It is the real-world live system with actual data and configurations.
It is accessible only to authorized end-users and business personnel.
Any changes or updates made to the Prod environment are closely monitored and controlled.