[
"If utilizing user accounts, attempt to submit a username that contains homoglyphs. Similarly, check to see if links containing homoglyphs can be sent via email, web browsers, or other mechanisms.",
"In theory this weakness can be detected through the use of white box testing techniques where specifically crafted test cases are used in conjunction with debuggers to verify the order of statements being executed.",
"To find the issue in the implementation, manual checks or automated static analysis could be applied to the XML configuration files.",
"To find the issue in the implementation, manual checks or automated static analysis could be applied to the XML configuration files.",
"This weakness can often be detected using automated static analysis tools. Many modern tools use data flow analysis or constraint-based techniques to minimize the number of false positives.",
"This weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.",
"\n Some instances of improper input validation can be detected using automated static analysis.\n A static analysis tool might allow the user to specify which application-specific methods or functions perform input validation; the tool might also have built-in knowledge of validation frameworks such as Struts. The tool may then suppress or de-prioritize any associated warnings. This allows the analyst to focus on areas of the software in which input validation does not appear to be present.\n Except in the cases described in the previous paragraph, automated static analysis might not be able to recognize when proper input validation is being performed, leading to false positives - i.e., warnings that do not have any security consequences or require any code changes.\n ",
"\n Kernel integrity verification can help identify when shared resource configuration settings have been modified.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n Cost effective for partial coverage:\n \n \n Source Code Quality Analyzer\n \n \n \n ",
"\n This weakness can often be detected using automated static analysis tools. Many modern tools use data flow analysis or constraint-based techniques to minimize the number of false positives.\n Automated static analysis generally does not account for environmental considerations when reporting out-of-bounds memory operations. This can make it difficult for users to determine which warnings should be investigated first. For example, an analysis tool might report buffer overflows that originate from command line arguments in a program that is not expected to run with setuid or other special privileges.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"This weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode Quality Analysis\n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"Authentication and authorization of debug and test interfaces should be part of the architecture and design review process. Withholding of private register documentation from the debug and test interface public specification (\"Security by obscurity\") should not be considered as sufficient security.",
"Dynamic tests should be done in the pre-silicon and post-silicon stages to verify that the debug and test interfaces are not open by default.",
"Tests that fuzz Debug and Test Interfaces should ensure that no access without appropriate authentication and authorization is possible.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n This weakness can often be detected using automated static analysis tools. Many modern tools use data flow analysis or constraint-based techniques to minimize the number of false positives.\n Automated static analysis generally does not account for environmental considerations when reporting out-of-bounds memory operations. This can make it difficult for users to determine which warnings should be investigated first. For example, an analysis tool might report buffer overflows that originate from command line arguments in a program that is not expected to run with setuid or other special privileges.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"Manual analysis can be useful for finding this weakness, but it might not achieve desired code coverage within limited time constraints. This becomes difficult for weaknesses that must be considered for all inputs, since the attack surface can be too large.",
"This weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.",
"Set the lock bit. Power cycle the\n\t device. Attempt to clear the lock bit. If the\n\t information is changed, implement a design\n\t fix. Retest. Also, attempt to indirectly clear the lock\n\t bit or bypass it.",
"Set the lock bit. Attempt to modify the\n\t information protected by the lock bit. If the information\n\t is changed, implement a design fix. Retest. Also, attempt\n\t to indirectly clear the lock bit or bypass\n\t it.",
"\n\t\tIt needs to be determined if the output of a cryptographic primitive is lacking entropy, which is one clear sign that something went wrong with the crypto implementation. There exist many methods of measuring the entropy of a bytestream, from sophisticated ones (like calculating Shannon's entropy of a sequence of characters) to crude ones (by compressing it and comparing the size of the original bytestream vs. the compressed - a truly random byte stream should not be compressible and hence the uncompressed and compressed bytestreams should be nearly identical in size).",
"For hardware, during the implementation (pre-Silicon / post-Silicon) phase, dynamic tests should be done to ensure that outputs from cryptographic routines are indeed working properly, such as test vectors provided by NIST [REF-1236].",
"Review requirements, documentation, and product design to ensure that primitives are consistent with the strongest-available recommendations from trusted parties. If the product appears to be using custom or proprietary implementations that have not had sufficient public review and approval, then this is a significant concern.",
"Analyze the product to ensure that implementations for each primitive do not contain any known vulnerabilities and are not using any known-weak algorithms, including MD4, MD5, SHA1, DES, etc.",
"Check 2 devices for their passcode to authenticate access to JTAG/debugging ports. If the passcodes are missing or the same, update the design to fix and retest. Check communications over JTAG/debugging ports for encryption. If the communications are not encrypted, fix the design and retest.",
"\n\t\t\t\n\t\t\tPut the processor in an infinite\n\t\t\tloop, which is then followed by instructions\n\t\t\tthat should not ever be executed, since the\n\t\t\tloop is not expected to exit. After the loop,\n\t\t\ttoggle an I/O bit (for oscilloscope monitoring\n\t\t\tpurposes), print a console message, and\n\t\t\treenter the loop. Note that to ensure that\n\t\t\tthe loop exit is actually captured, many NOP\n\t\t\tinstructions should be coded after the loop\n\t\t\tbranch instruction and before the I/O bit\n\t\t\ttoggle and the print statement.\n\n\t\t\tMargining the clock consists of varying the clock\n\t\t\tfrequency until an anomaly occurs. This could be a\n\t\t\tcontinuous frequency change or it could be a single\n\t\t\tcycle. The single cycle method is described here. For\n\t\t\tevery 1000th clock pulse, the clock cycle is shortened by\n\t\t\t10 percent. If no effect is observed, the width is\n\t\t\tshortened by 20%. This process is continued in 10%\n\t\t\tincrements up to and including 50%. Note that the cycle\n\t\t\ttime may be increased as well, down to seconds per\n\t\t\tcycle.\n\n\t\t\tSeparately, the voltage is margined. Note that\n\t\t\tthe voltage could be increased or decreased. Increasing\n\t\t\tthe voltage has limits, as the circuitry may not be able\n\t\t\tto withstand a drastically increased voltage. This process\n\t\t\tstarts with a 5% reduction of the DC supply to the CPU\n\t\t\tchip for 5 millisecond repeated at 1KHz. If this has no\n\t\t\teffect, the process is repeated, but a 10% reduction is\n\t\t\tused. This process is repeated at 10% increments down to a\n\t\t\t50% reduction. If no effects are observed at 5\n\t\t\tmillisecond, the whole process is repeated using a 10\n\t\t\tmillisecond pulse. If no effects are observed, the process\n\t\t\tis repeated in 10 millisecond increments out to 100\n\t\t\tmillisecond pulses.\n\n\t\t\tWhile these are suggested starting points for\n\t\t\ttesting circuitry for weaknesses, the limits may need to\n\t\t\tbe pushed further at the risk of device damage. See\n\t\t\t[REF-1217] for descriptions of Smart Card attacks against\n\t\t\ta clock (section 14.6.2) and using a voltage glitch\n\t\t\t(section 15.5.3).\n\t\t ",
"\n\t\t Many SoCs come equipped with a built-in Dynamic Voltage and Frequency Scaling (DVFS) that can control the voltage and clocks via software alone. However, there have been demonstrated attacks (like Plundervolt and CLKSCREW) that target this DVFS [REF-1081] [REF-1082]. During the design and implementation phases, one needs to check if the interface to this power management feature is available from unprivileged SW (CWE-1256), which would make the attack very easy.\n\t\t ",
"\n\t\t Review if the protections against glitching merely transfer the attack target. For example, suppose a critical authentication routine that an attacker would want to bypass is given the protection of modifying certain artifacts from within that specific routine (so that if the routine is bypassed, one can examine the artifacts and figure out that an attack must have happened). However, if the attacker has the ability to bypass the critical authentication routine, they might also have the ability to bypass the other protection routine that checks the artifacts. Basically, depending on these kind of protections is akin to resorting to \"Security by Obscurity\".\n\t\t ",
"\n\t\t During the implementation phase where actual hardware is available, specialized hardware tools and apparatus such as ChipWhisperer may be used to check if the platform is indeed susceptible to voltage and clock glitching attacks.\n\t\t ",
"\n\t\t Use custom software to change registers that control clock settings or power settings to try to bypass security locks, or repeatedly write DRAM to try to change adjacent locations. This can be effective in extracting or changing data. The drawback is that it cannot be run before manufacturing, and it may require specialized software.\n\t\t",
"Perform a security evaluation of system-level\n\t\tarchitecture and design with software-aided physical attacks\n\t\tin scope.",
"Create a high privilege memory block of any arbitrary size. Attempt to create a lower privilege memory block with an overlap of the high privilege memory block. If the creation attempt works, fix the hardware. Repeat the test.",
"Functional simulation is applicable during the Implementation Phase. Testcases must be created and executed for memory mapped registers to verify adherence to the access control policy. This method can be effective, since functional verification needs to be performed on the design, and verification for this weakness will be included. There can be difficulty covering the entire memory space during the test.",
"Information flow tracking can be applicable during the Implementation phase. Security sensitive data (assets) - for example, as stored in registers - is automatically tracked over time through the design to verify the data doesn't reach illegal destinations that violate the access policies for the memory map. This method can be very effective when used together with simulation and emulation, since detecting violations doesn't rely on specific scenarios or data values. This method does rely on simulation and emulation, so testcases must exist in order to use this method.",
"Perform penetration testing (either manual or semi-automated with fuzzing) to verify that access control mechanisms such as the memory protection units or on-chip bus firewall settings adequately protect critical hardware registers from software access.",
"This is applicable in the Architecture phase before implementation started. Make sure access policy is specified for the entire memory map. Manual analysis may not ensure the implementation is correct.",
"Manual documentation review of the system memory map, register specification, and permissions associated with accessing security-relevant functionality exposed via memory-mapped registers.",
"Registers controlling hardware should have access control implemented. This access control may be checked manually for correct implementation. Items to check consist of how are trusted parties set, how are trusted parties verified, how are accesses verified, etc. Effectiveness of a manual analysis will vary depending upon how complicated the interface is constructed.",
"Formal verification is applicable during the Implementation phase. Assertions need to be created in order to capture illegal register access scenarios and prove that they cannot occur. Formal methods are exhaustive and can be very effective, but creating the cases for large designs may be complex and difficult.",
"Write a known pattern into each sensitive location. Enter the power/debug state in question. Read data back from the sensitive locations. If the reads are successful, and the data is the same as the pattern that was originally written, the test fails and the device needs to be fixed. Note that this test can likely be automated.",
"\n\t\t\t Analyze the device using the following steps:\n\t\t\t \n\t\t\t\t1) Identify all fabric master agents that are active during system Boot Flow when initial code is loaded from Non-volatile storage to volatile memory.\n\t\t\t\t2) Identify the volatile memory regions that are used for storing loaded system executable program.\n\t\t\t\t3) During system boot, test programming the identified memory regions in step 2 from all the masters identified in step 1.\n\t\t\t \n\t\t\t Only trusted masters should be allowed to write to the memory regions. For example, pluggable device peripherals should not have write access to program load memory regions.\n\t\t\t ",
"Ensure the volatile memory is lockable or has locks. Ensure the volatile memory is locked for writes from untrusted agents or adversaries. Try modifying the volatile memory from an untrusted agent, and ensure these writes are dropped.\n\t\t\t ",
"Determine if there is a lack of a capability to update read-only memory structure. This could manifest as a difference between the latest firmware version and current version within the device.",
"Check the consumer or maintainer documentation, the architecture/design documentation, or the original requirements to ensure that the documentation includes details for how to update the firmware.",
"Create a new installable boot image of the current build with a minor version number change. Use the standard installation method to update the boot image. Verify that the minor version number has changed. Create a fake image. Verify that the boot updater will not install the fake image and generates an invalid image error message.",
"Black box methods might not get the needed code coverage within limited time constraints, and a dynamic test might not produce any noticeable side effects even if it is successful.",
"\n This weakness can often be detected using automated static analysis tools. Many modern tools use data flow analysis or constraint-based techniques to minimize the number of false positives.\n Automated static analysis generally does not account for environmental considerations when reporting out-of-bounds memory operations. This can make it difficult for users to determine which warnings should be investigated first. For example, an analysis tool might report array index errors that originate from command line arguments in a program that is not expected to run with setuid or other special privileges.\n ",
"This weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.",
"\n\t\t\t\t\t\tCompare the debug key with the production key to make sure that they are not the same.\n\t\t\t\t\t",
"\n\t\t\t\t\t\tCompare the debug key with the production key to make sure that they are not the same.\n\t\t\t\t\t",
"Appropriate Post-Si tests should be carried out at various authorization levels to ensure that debug components are properly chained and accessible only to users with appropriate credentials.",
"Appropriate Post-Si tests should be carried out at various authorization levels to ensure that debug components are properly chained and accessible only to users with appropriate credentials.",
"Appropriate Post-Si tests should be carried out to ensure that residual confidential information is not left on parts leaving one facility for another facility.",
"Appropriate Post-Si tests should be carried out to ensure that residual confidential information is not left on parts leaving one facility for another facility.",
"\n\t\t\t\t Post-silicon, perform full side-channel attacks (penetration testing) covering as many known leakage models as possible against test code.",
"Perform a set of leakage detection tests such as the procedure outlined in the Test Vector Leakage Assessment (TVLA) test requirements for AES [REF-1230]. TVLA is the basis for the ISO standard 17825 [REF-1229]. A separate methodology is provided by [REF-1228]. Note that sole reliance on this method might not yield expected results [REF-1239] [REF-1240].",
"\n\t\t\t\t Pre-silicon - while the aforementioned TVLA methods can be performed post-silicon, models of device power consumption or other physical emanations can be built from information present at various stages of the hardware design process before fabrication. TVLA or known side-channel attacks can be applied to these simulated traces and countermeasures applied before tape-out. Academic research in this field includes [REF-1231] [REF-1232] [REF-1233].",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"This weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.",
"Manual analysis can be useful for finding this weakness, but it might not achieve desired code coverage within limited time constraints. This becomes difficult for weaknesses that must be considered for all inputs, since the attack surface can be too large.",
"\n This weakness can be detected using tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session.\n Specifically, manual static analysis is useful for evaluating the correctness of allocation calculations. This can be useful for detecting overflow conditions (CWE-190) or similar weaknesses that might have serious security impacts on the program.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n Cost effective for partial coverage:\n \n \n Source Code Quality Analyzer\n \n \n \n ",
"\n This weakness can often be detected using automated static analysis tools. Many modern tools use data flow analysis or constraint-based techniques to minimize the number of false positives.\n Automated static analysis generally does not account for environmental considerations when reporting potential errors in buffer calculations. This can make it difficult for users to determine which warnings should be investigated first. For example, an analysis tool might report buffer overflows that originate from command line arguments in a program that is not expected to run with setuid or other special privileges.\n ",
"Using an external debugger, send write transactions to mirrored regions to test if original, write-protected regions are modified. Similarly, send read transactions to mirrored regions to test if the original, read-protected signals can be read.",
"Review address map in specification to see if there are any overlapping ranges.",
"Negative testing of access control on overlapped ranges.",
"Formal verification of bridge RTL to ensure that access control cannot be bypassed. ",
"RTL simulation to ensure that bridge-access controls are implemented properly.",
"Lack of security features can also be confirmed through manual RTL review of the fabric RTL. ",
"Review the fabric specification and ensure that it contains signals to transfer security-sensitive signals. ",
"Automated testing can verify that RoT components are immutable.",
"Root of trust elements and memory should be part of architecture and design reviews.",
"Anti-roll-back features should be reviewed as part of Architecture or Design review.",
"Mutability of stored security version numbers and programming with older firmware images should be part of automated testing.",
"\n\t\t\t\t\t\n\t\t\t\t\tTesting of memory-device contents after clearing or erase commands.\n\t\t\t\t\tDynamic analysis of memory contents during device operation to detect specific, confidential assets.\n\t\t\t\t\tArchitecture and design analysis of memory clear and erase operations.\n\t\t\t\t\t\n\t\t\t\t\t",
"\n\t\t\t\t\t\n\t\t\t\t\tTesting of memory-device contents after clearing or erase commands.\n\t\t\t\t\tDynamic analysis of memory contents during device operation to detect specific, confidential assets.\n\t\t\t\t\tArchitecture and design analysis of memory clear and erase operations.\n\t\t\t\t\t\n\t\t\t\t\t",
"Providing marker flags to send through the interfaces coupled with examination of which users are able to read or manipulate the flags will help verify that the proper isolation has been achieved and is effective.",
"This weakness can be found using automated dynamic analysis. Both emulation of a CPU with instruction skips, as well as RTL simulation of a CPU IP, can indicate parts of the code that are sensitive to faults due to instruction skips.",
"This weakness can be found using manual (static) analysis. The analyst has security objectives that are matched against the high-level code. This method is less precise than emulation, especially if the analysis is done at the higher level language rather than at assembly level.",
"This weakness can be found using automated static analysis once a developer has indicated which code paths are critical to protect.",
"Power management controls should be part of Architecture and Design reviews.",
"Dynamic tests should be performed to stress-test temperature controls.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode simple extractor - strings, ELF readers, etc.\n \n \n \n ",
"Since format strings often occur in rarely-occurring erroneous conditions (e.g. for error message logging), they can be difficult to detect using black box methods. It is highly likely that many latent issues exist in executables that do not have associated source code (or equivalent source.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"This weakness can often be detected using automated static analysis tools. Many modern tools use data flow analysis or constraint-based techniques to minimize the number of false positives.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n Cost effective for partial coverage:\n \n \n Warning Flags\n \n \n \n ",
"Some compiler instrumentation tools such as AddressSanitizer (ASan) can indirectly detect some instances of this weakness.",
"For commonly-used APIs and resource types, automated tools often have signatures that can spot this issue.",
"This specific weakness is impossible to detect using black box methods. While an analyst could examine memory to see that it has not been scrubbed, an analysis of the executable would not be successful. This is because the compiler has already removed the relevant code. Only the source code shows whether the programmer intended to clear the memory or not, so this weakness is indistinguishable from others.",
"This weakness is only detectable using white box methods (see black box detection factor). Careful analysis is required to determine if the code is likely to be removed by the compiler.",
"Exploitation of a vulnerability with commonly-used manipulations might fail, but minor variations might succeed.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"This weakness can often be detected using automated static analysis tools. Many modern tools use data flow analysis or constraint-based techniques to minimize the number of false positives.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n This weakness can be detected using tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session.\n Specifically, manual static analysis is useful for evaluating the correctness of allocation calculations. This can be useful for detecting overflow conditions (CWE-190) or similar weaknesses that might have serious security impacts on the program.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Manual Source Code Review (not inspections)\n \n \n \n ",
"Sometimes, evidence of this weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.",
"Because byte ordering bugs are usually very noticeable even with normal inputs, this bug is more likely to occur in rarely triggered error conditions, making them difficult to detect using black box methods.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n Some instances of improper input validation can be detected using automated static analysis.\n A static analysis tool might allow the user to specify which application-specific methods or functions perform input validation; the tool might also have built-in knowledge of validation frameworks such as Struts. The tool may then suppress or de-prioritize any associated warnings. This allows the analyst to focus on areas of the software in which input validation does not appear to be present.\n Except in the cases described in the previous paragraph, automated static analysis might not be able to recognize when proper input validation is being performed, leading to false positives - i.e., warnings that do not have any security consequences or require any code changes.\n ",
"Fuzzing techniques can be useful for detecting input validation errors. When unexpected inputs are provided to the software, the software should not crash or otherwise become unstable, and it should generate application-controlled error messages. If exceptions or interpreter-generated error messages occur, this indicates that the input was not detected and handled within the application logic itself.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Attack Modeling\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n Cost effective for partial coverage:\n \n \n Host Application Interface Scanner\n Monitored Virtual Environment - run potentially malicious code in sandbox / wrapper / virtual machine, see if it does anything suspicious\n \n \n \n ",
"When custom input validation is required, such as when enforcing business rules, manual analysis is necessary to ensure that the validation is properly implemented.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Attack Modeling\n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Inter-application Flow Analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Context-configured Source Code Weakness Analyzer\n \n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n Automated Monitored Execution\n Monitored Virtual Environment - run potentially malicious code in sandbox / wrapper / virtual machine, see if it does anything suspicious\n \n \n \n ",
"Automated methods may be able to detect certain idioms automatically, such as exposed stack traces or pathnames, but violation of business rules or privacy requirements is not typically feasible.",
"Identify error conditions that are not likely to occur during normal usage and trigger them. For example, run the program under low memory conditions, run with insufficient privileges or permissions, interrupt a transaction before it is completed, or disable connectivity to basic network services such as DNS. Monitor the software for any unexpected behavior. If you trigger an unhandled exception or similar error that was discovered and handled by the application's environment, it may still indicate unexpected conditions that were not handled by the application itself.",
"This weakness generally requires domain-specific interpretation using manual analysis. However, the number of potential error conditions may be too large to cover completely within limited time constraints.",
"\n This weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.\n Error conditions may be triggered with a stress-test by calling the software simultaneously from a large number of threads or processes, and look for evidence of any unexpected behavior.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n \n \n Cost effective for partial coverage:\n \n \n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"Manual white box techniques may be able to provide sufficient code coverage and reduction of false positives if all file access operations can be assessed within limited time constraints.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"Automated techniques can find areas where path traversal weaknesses exist. However, tuning or customization may be required to remove or de-prioritize path-traversal problems that are only exploitable by the software's administrator - or other privileged users - and thus potentially valid behavior or, at worst, a bug instead of a vulnerability.",
"Write a known pattern into each sensitive location. Trigger the release of the resource or cause the desired state transition to occur. Read data back from the sensitive locations. If the reads are successful, and the data is the same as the pattern that was originally written, the test fails and the product needs to be fixed. Note that this test can likely be automated.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Attack Modeling\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Host Application Interface Scanner\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Host-based Vulnerability Scanners - Examine configuration for flaws, verifying that audit mechanisms work, ensure host configuration meets certain predefined criteria\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Compare binary / bytecode to application permission manifest\n \n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"This weakness can be detected using tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session.",
"\n Use monitoring tools that examine the software's process as it interacts with the operating system and the network. This technique is useful in cases when source code is unavailable, if the software was not developed by you, or if you want to verify that the build phase did not introduce any new weaknesses. Examples include debuggers that directly attach to the running process; system-call tracing utilities such as truss (Solaris) and strace (Linux); system activity monitors such as FileMon, RegMon, Process Monitor, and other Sysinternals utilities (Windows); and sniffers and protocol analyzers that monitor network traffic.\n Attach the monitor to the process and perform a login. Look for library functions and system calls that indicate when privileges are being raised or dropped. Look for accesses of resources that are restricted to normal users.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Configuration Checker\n Permission Manifest Analysis\n \n \n \n ",
"\n Use monitoring tools that examine the software's process as it interacts with the operating system and the network. This technique is useful in cases when source code is unavailable, if the software was not developed by you, or if you want to verify that the build phase did not introduce any new weaknesses. Examples include debuggers that directly attach to the running process; system-call tracing utilities such as truss (Solaris) and strace (Linux); system activity monitors such as FileMon, RegMon, Process Monitor, and other Sysinternals utilities (Windows); and sniffers and protocol analyzers that monitor network traffic.\n Attach the monitor to the process and perform a login. Using disassembled code, look at the associated instructions and see if any of them appear to be comparing the input to a fixed string or value.\n ",
"This weakness can be detected using tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Attack Modeling\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Permission Manifest Analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Host-based Vulnerability Scanners - Examine configuration for flaws, verifying that audit mechanisms work, ensure host configuration meets certain predefined criteria\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Compare binary / bytecode to application permission manifest\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Host Application Interface Scanner\n \n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n Automated Monitored Execution\n Forced Path Execution\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Inter-application Flow Analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Configuration Checker\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Host-based Vulnerability Scanners - Examine configuration for flaws, verifying that audit mechanisms work, ensure host configuration meets certain predefined criteria\n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n Automated static analysis is useful for detecting commonly-used idioms for authorization. A tool may be able to analyze related configuration files, such as .htaccess in Apache web servers, or detect the usage of commonly-used authorization libraries.\n Generally, automated static analysis tools have difficulty detecting custom authorization schemes. In addition, the software's design may include some functionality that is accessible to any user and does not require an authorization check; an automated technique that detects the absence of authorization may report false positives.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n This weakness can be detected using tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session.\n Specifically, manual static analysis is useful for evaluating the correctness of custom authorization mechanisms.\n ",
"Automated dynamic analysis may find many or all possible interfaces that do not require authorization, but manual analysis is required to determine if the lack of authorization violates business logic",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Host Application Interface Scanner\n Fuzz Tester\n Framework-based Fuzzer\n Forced Path Execution\n Monitored Virtual Environment - run potentially malicious code in sandbox / wrapper / virtual machine, see if it does anything suspicious\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n \n ",
"\n Automated static analysis is useful for detecting certain types of authentication. A tool may be able to analyze related configuration files, such as .htaccess in Apache web servers, or detect the usage of commonly-used authentication libraries.\n Generally, automated static analysis tools have difficulty detecting custom authentication schemes. In addition, the software's design may include some functionality that is accessible to any user and does not require an established identity; an automated technique that detects the absence of authentication may report false positives.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Configuration Checker\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n This weakness can be detected using tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session.\n Manual static analysis is useful for evaluating the correctness of custom authentication mechanisms.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Man-in-the-middle attack tool\n \n \n \n ",
"Set up an untrusted endpoint (e.g. a server) with which the software will connect. Create a test certificate that uses an invalid hostname but is signed by a trusted CA and provide this certificate from the untrusted endpoint. If the software performs any operations instead of disconnecting and reporting an error, then this indicates that the hostname is not being checked and the test certificate has been accepted.",
"When Certificate Pinning is being used in a mobile application, consider using a tool such as Spinner [REF-955]. This methodology might be extensible to other technologies.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Attack Modeling\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n This weakness can be detected using tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session.\n Specifically, manual static analysis is useful for evaluating the correctness of custom authentication mechanisms.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n Automated static analysis is useful for detecting commonly-used idioms for authentication. A tool may be able to analyze related configuration files, such as .htaccess in Apache web servers, or detect the usage of commonly-used authentication libraries.\n Generally, automated static analysis tools have difficulty detecting custom authentication schemes. In addition, the software's design may include some functionality that is accessible to any user and does not require an established identity; an automated technique that detects the absence of authentication may report false positives.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Host Application Interface Scanner\n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n Cost effective for partial coverage:\n \n \n Host-based Vulnerability Scanners - Examine configuration for flaws, verifying that audit mechanisms work, ensure host configuration meets certain predefined criteria\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n Cost effective for partial coverage:\n \n \n Forced Path Execution\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Configuration Checker\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"The characterizaton of sensitive data often requires domain-specific understanding, so manual methods are useful. However, manual efforts might not achieve desired code coverage within limited time constraints. Black box methods may produce artifacts (e.g. stored data or unencrypted network transfer) that require manual evaluation.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"Automated measurement of the entropy of an input/output source may indicate the use or lack of encryption, but human analysis is still required to distinguish intentionally-unencrypted data (e.g. metadata) from sensitive data.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Attack Modeling\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Network Sniffer\n \n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n Automated Monitored Execution\n Man-in-the-middle attack tool\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n Use monitoring tools that examine the software's process as it interacts with the operating system and the network. This technique is useful in cases when source code is unavailable, if the software was not developed by you, or if you want to verify that the build phase did not introduce any new weaknesses. Examples include debuggers that directly attach to the running process; system-call tracing utilities such as truss (Solaris) and strace (Linux); system activity monitors such as FileMon, RegMon, Process Monitor, and other Sysinternals utilities (Windows); and sniffers and protocol analyzers that monitor network traffic.\n Attach the monitor to the process, trigger the feature that sends the data, and look for the presence or absence of common cryptographic functions in the call tree. Monitor the network and determine if the data packets contain readable commands. Tools exist for detecting if certain encodings are in use. If the traffic contains high entropy, this might indicate the usage of encryption.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Configuration Checker\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n Binary / Bytecode simple extractor - strings, ELF readers, etc.\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"Automated methods may be useful for recognizing commonly-used libraries or features that have become obsolete.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Man-in-the-middle attack tool\n \n \n Cost effective for partial coverage:\n \n \n Framework-based Fuzzer\n Automated Monitored Execution\n Monitored Virtual Environment - run potentially malicious code in sandbox / wrapper / virtual machine, see if it does anything suspicious\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"This weakness can be detected using tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n Use monitoring tools that examine the software's process as it interacts with the operating system and the network. This technique is useful in cases when source code is unavailable, if the software was not developed by you, or if you want to verify that the build phase did not introduce any new weaknesses. Examples include debuggers that directly attach to the running process; system-call tracing utilities such as truss (Solaris) and strace (Linux); system activity monitors such as FileMon, RegMon, Process Monitor, and other Sysinternals utilities (Windows); and sniffers and protocol analyzers that monitor network traffic.\n Attach the monitor to the process and look for library functions that indicate when randomness is being used. Run the process multiple times to see if the seed changes. Look for accesses of devices or equivalent resources that are commonly used for strong (or weak) randomness, such as /dev/urandom on Linux. Look for library or system calls that access predictable information such as process IDs and system time.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Man-in-the-middle attack tool\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"CSRF is currently difficult to detect reliably using automated techniques. This is because each application has its own implicit security policy that dictates which requests can be influenced by an outsider and automatically performed on behalf of a user, versus which requests require strong confidence that the user intends to make the request. For example, a keyword search of the public portion of a web site is typically expected to be encoded within a link that can be launched automatically when the user clicks on the link.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n This weakness can be detected using tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session.\n Specifically, manual analysis can be useful for finding this weakness, and for minimizing false positives assuming an understanding of business logic. However, it might not achieve desired code coverage within limited time constraints. For black-box analysis, if credentials are not known for privileged accounts, then the most security-critical portions of the application may not receive sufficient attention.\n Consider using OWASP CSRFTester to identify potential issues and aid in manual analysis.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Web Application Scanner\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n Private personal data can enter a program in a variety of ways:\n \n \n Directly from the user in the form of a password or personal information\n Accessed from a database or other data store by the application\n Indirectly from a partner or other third party\n \n If the data is written to an external location - such as the console, file system, or network - a privacy violation may occur.\n \n ",
"\n This weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.\n Race conditions may be detected with a stress-test by calling the software simultaneously from a large number of threads or processes, and look for evidence of any unexpected behavior.\n Insert breakpoints or delays in between relevant code statements to artificially expand the race window so that it will be easier to detect.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"Common idioms are detectable in white box analysis, such as time-of-check-time-of-use (TOCTOU) file operations (CWE-367), or double-checked locking (CWE-609).",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"Black box methods may be able to identify evidence of race conditions via methods such as multiple simultaneous connections, which may cause the software to become instable or crash. However, race conditions with very narrow timing windows would not be detectable.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n \n \n Cost effective for partial coverage:\n \n \n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Framework-based Fuzzer\n \n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Monitored Virtual Environment - run potentially malicious code in sandbox / wrapper / virtual machine, see if it does anything suspicious\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Framework-based Fuzzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"Certain automated dynamic analysis techniques may be effective in spotting resource exhaustion problems, especially with resources such as processes, memory, and connections. The technique may involve generating a large number of requests to the software within a short time frame.",
"While fuzzing is typically geared toward finding low-level implementation bugs, it can inadvertently find resource exhaustion problems. This can occur when the fuzzer generates a large number of test cases but does not restart the targeted software in between test cases. If an individual test case produces a crash, but it does not do so reliably, then an inability to handle resource exhaustion may be the cause.",
"\n Automated static analysis typically has limited utility in recognizing resource exhaustion problems, except for program-independent system resources such as files, sockets, and processes. For system resources, automated static analysis may be able to detect circumstances in which resources are not released after they have expired. Automated analysis of configuration files may be able to detect settings that do not specify a maximum value.\n Automated static analysis tools will not be appropriate for detecting exhaustion of custom resources, such as an intended security policy in which a bulletin board user is only allowed to make a limited number of posts per day.\n ",
"\n This weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.\n Resource clean up errors might be detected with a stress-test by calling the software simultaneously from a large number of threads or processes, and look for evidence of any unexpected behavior. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.\n ",
"Identify error conditions that are not likely to occur during normal usage and trigger them. For example, run the program under low memory conditions, run with insufficient privileges or permissions, interrupt a transaction before it is completed, or disable connectivity to basic network services such as DNS. Monitor the software for any unexpected behavior. If you trigger an unhandled exception or similar error that was discovered and handled by the application's environment, it may still indicate unexpected conditions that were not handled by the application itself.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"Automated code analysis techniques might not be able to reliably detect this weakness, since the application's behavior and general security model dictate which resource locks are critical. Interpretation of the weakness might require knowledge of the environment, e.g. if the existence of a file is used as a lock, but the file is created in a world-writable directory.",
"Use automated static analysis tools that target this type of weakness. Many modern techniques use data flow analysis to minimize the number of false positives. This is not a perfect solution, since 100% accuracy and coverage are not feasible.",
"Use tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session. These may be more effective than strictly automated techniques. This is especially the case with weaknesses that are related to design and business rules.",
"\n Use monitoring tools that examine the software's process as it interacts with the operating system and the network. This technique is useful in cases when source code is unavailable, if the software was not developed by you, or if you want to verify that the build phase did not introduce any new weaknesses. Examples include debuggers that directly attach to the running process; system-call tracing utilities such as truss (Solaris) and strace (Linux); system activity monitors such as FileMon, RegMon, Process Monitor, and other Sysinternals utilities (Windows); and sniffers and protocol analyzers that monitor network traffic.\n Attach the monitor to the process and look for library functions and system calls that suggest when a search path is being used. One pattern is when the program performs multiple accesses of the same file but in different directories, with repeated failures until the proper filename is found. Library calls such as getenv() or their equivalent can be checked to see if any path-related variables are being accessed.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"Identify error conditions that are not likely to occur during normal usage and trigger them. For example, run the program under low memory conditions, run with insufficient privileges or permissions, interrupt a transaction before it is completed, or disable connectivity to basic network services such as DNS. Monitor the software for any unexpected behavior. If you trigger an unhandled exception or similar error that was discovered and handled by the application's environment, it may still indicate unexpected conditions that were not handled by the application itself.",
"This weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Debugger\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Origin Analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source Code Quality Analyzer\n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Binary / Bytecode Quality Analysis\n \n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"This weakness can be found easily using static analysis. However in some cases an operator might appear to be incorrect, but is actually correct and reflects unusual logic within the program.",
"This weakness can be found easily using static analysis. However in some cases an operator might appear to be incorrect, but is actually correct and reflects unusual logic within the program.",
"Omission of a break statement might be intentional, in order to support fallthrough. Automated detection methods might therefore be erroneous. Semantic understanding of expected program behavior is required to interpret whether the code is correct.",
"Since this weakness is associated with a code construct, it would be indistinguishable from other errors that produce the same behavior.",
"\n This weakness can be detected using tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session.\n Specifically, manual static analysis is typically required to find the behavior that triggers the download of code, and to determine whether integrity-checking methods are in use.\n ",
"\n Use monitoring tools that examine the software's process as it interacts with the operating system and the network. This technique is useful in cases when source code is unavailable, if the software was not developed by you, or if you want to verify that the build phase did not introduce any new weaknesses. Examples include debuggers that directly attach to the running process; system-call tracing utilities such as truss (Solaris) and strace (Linux); system activity monitors such as FileMon, RegMon, Process Monitor, and other Sysinternals utilities (Windows); and sniffers and protocol analyzers that monitor network traffic.\n Attach the monitor to the process and also sniff the network connection. Trigger features related to product updates or plugin installation, which is likely to force a code download. Monitor when files are downloaded and separately executed, or if they are otherwise read back into the process. Look for evidence of cryptographic library calls that use integrity checking.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Origin Analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Automated Monitored Execution\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n Generated Code Inspection\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n Generated Code Inspection\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Automated Monitored Execution\n Forced Path Execution\n Debugger\n Monitored Virtual Environment - run potentially malicious code in sandbox / wrapper / virtual machine, see if it does anything suspicious\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Inter-application Flow Analysis\n Binary / Bytecode simple extractor - strings, ELF readers, etc.\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n Cost effective for partial coverage:\n \n \n Formal Methods / Correct-By-Construction\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Binary / Bytecode Quality Analysis\n Compare binary / bytecode to application permission manifest\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source Code Quality Analyzer\n \n \n Cost effective for partial coverage:\n \n \n Warning Flags\n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Automated Monitored Execution\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Permission Manifest Analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Attack Modeling\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"Since this weakness does not typically appear frequently within a single software package, manual white box techniques may be able to provide sufficient code coverage and reduction of false positives if all potentially-vulnerable operations can be assessed within limited time constraints.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"Whether this issue poses a vulnerability will be subject to the intended behavior of the application. For example, a search engine might intentionally provide redirects to arbitrary URLs.",
"Automated static analysis tools may not be able to determine whether input influences the beginning of a URL, which is important for reducing false positives.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n \n ",
"Automated black box tools that supply URLs to every input may be able to spot Location header modifications, but test case coverage is a factor, and custom redirects may not be detected.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"Since these bugs typically introduce incorrect behavior that is obvious to users, they are found quickly, unless they occur in rarely-tested code paths. Managing the correct number of arguments can be made more difficult in cases where format strings are used, or when variable numbers of arguments are supported.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Compare binary / bytecode to application permission manifest\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Attack Modeling\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n This weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.\n Initialization problems may be detected with a stress-test by calling the software simultaneously from a large number of threads or processes, and look for evidence of any unexpected behavior. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.\n ",
"Identify error conditions that are not likely to occur during normal usage and trigger them. For example, run the program under low memory conditions, run with insufficient privileges or permissions, interrupt a transaction before it is completed, or disable connectivity to basic network services such as DNS. Monitor the software for any unexpected behavior. If you trigger an unhandled exception or similar error that was discovered and handled by the application's environment, it may still indicate unexpected conditions that were not handled by the application itself.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Origin Analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode Quality Analysis\n Binary / Bytecode simple extractor - strings, ELF readers, etc.\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Debugger\n \n \n Cost effective for partial coverage:\n \n \n Monitored Virtual Environment - run potentially malicious code in sandbox / wrapper / virtual machine, see if it does anything suspicious\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n Cost effective for partial coverage:\n \n \n Warning Flags\n Source Code Quality Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n This weakness can be detected using tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session.\n Specifically, manual static analysis is useful for evaluating the correctness of allocation calculations. This can be useful for detecting overflow conditions (CWE-190) or similar weaknesses that might have serious security impacts on the program.\n ",
"While this weakness might be caught by the compiler in some languages, it can occur more frequently in cases in which the called function accepts variable numbers of arguments, such as format strings in C. It also can occur in languages or environments that do not require that functions always be called with the correct number of arguments, such as Perl.",
"This might require an understanding of intended program behavior or design to determine whether the value is incorrect.",
"While this weakness might be caught by the compiler in some languages, it can occur more frequently in cases in which the called function accepts variable numbers of arguments, such as format strings in C. It also can occur in loosely typed languages or environments. This might require an understanding of intended program behavior or design to determine whether the value is incorrect.",
"Code analysis can require knowledge of API behaviors for library functions that might return NULL, reducing the chances of detection when unknown libraries are used.",
"This typically occurs in rarely-triggered error conditions, reducing the chances of detection during black box testing.",
"This issue might not be detected if testing is performed using a web browser, because the browser might obey the redirect and move the user to a different page before the application has produced outputs that indicate something is amiss.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Fault Injection - source code\n Fault Injection - binary\n \n \n Cost effective for partial coverage:\n \n \n Forced Path Execution\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n The external control or influence of filenames can often be detected using automated static analysis that models data flow within the software.\n Automated static analysis might not be able to recognize when proper input validation is being performed, leading to false positives - i.e., warnings that do not have any security consequences or require any code changes.\n ",
"\n Automated dynamic analysis may be effective in detecting permission problems for system resources such as files, directories, shared memory, device interfaces, etc.\n However, since the software's intended security policy might allow loose permissions for certain operations (such as publishing a file on a web server), automated dynamic analysis may produce some false positives - i.e., warnings that do not have any security consequences or require any code changes.\n When custom permissions models are used - such as defining who can read messages in a particular forum in a bulletin board system - these can be difficult to detect using automated dynamic analysis. It may be possible to define custom signatures that identify any custom functions that implement the permission checks and assignments.\n ",
"Manual dynamic analysis may be effective in detecting the use of custom permissions models and functions. The program could then be executed with a focus on exercising code paths that are related to the custom permissions. Then the human analyst could evaluate permission assignments in the context of the intended security model of the software.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Host Application Interface Scanner\n \n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n Automated Monitored Execution\n Forced Path Execution\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Inter-application Flow Analysis\n \n \n \n ",
"This weakness can be detected using tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Host-based Vulnerability Scanners - Examine configuration for flaws, verifying that audit mechanisms work, ensure host configuration meets certain predefined criteria\n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Configuration Checker\n \n \n \n ",
"Manual static analysis may be effective in detecting the use of custom permissions models and functions. The code could then be examined to identifying usage of the related functions. Then the human analyst could evaluate permission assignments in the context of the intended security model of the software.",
"Fuzzing is not effective in detecting this weakness.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n Automated static analysis may be effective in detecting permission problems for system resources such as files, directories, shared memory, device interfaces, etc. Automated techniques may be able to detect the use of library functions that modify permissions, then analyze function calls for arguments that contain potentially insecure values.\n However, since the software's intended security policy might allow loose permissions for certain operations (such as publishing a file on a web server), automated static analysis may produce some false positives - i.e., warnings that do not have any security consequences or require any code changes.\n When custom permissions models are used - such as defining who can read messages in a particular forum in a bulletin board system - these can be difficult to detect using automated static analysis. It may be possible to define custom signatures that identify any custom functions that implement the permission checks and assignments.\n ",
"\n Use monitoring tools that examine the software's process as it interacts with the operating system and the network. This technique is useful in cases when source code is unavailable, if the software was not developed by you, or if you want to verify that the build phase did not introduce any new weaknesses. Examples include debuggers that directly attach to the running process; system-call tracing utilities such as truss (Solaris) and strace (Linux); system activity monitors such as FileMon, RegMon, Process Monitor, and other Sysinternals utilities (Windows); and sniffers and protocol analyzers that monitor network traffic.\n Attach the monitor to the process and watch for library functions or system calls on OS resources such as files, directories, and shared memory. Examine the arguments to these calls to infer which permissions are being used.\n ",
"This weakness is only detectable using white box methods (see black box detection factor). Careful analysis is required to determine if the code is likely to be removed by the compiler.",
"This specific weakness is impossible to detect using black box methods. While an analyst could examine memory to see that it has not been scrubbed, an analysis of the executable would not be successful. This is because the compiler has already removed the relevant code. Only the source code shows whether the programmer intended to clear the memory or not, so this weakness is indistinguishable from others.",
"Automated static analysis may be useful for detecting unusual conditions involving system resources or common programming idioms, but not for violations of business rules.",
"Identify error conditions that are not likely to occur during normal usage and trigger them. For example, run the program under low memory conditions, run with insufficient privileges or permissions, interrupt a transaction before it is completed, or disable connectivity to basic network services such as DNS. Monitor the software for any unexpected behavior. If you trigger an unhandled exception or similar error that was discovered and handled by the application's environment, it may still indicate unexpected conditions that were not handled by the application itself.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Configuration Checker\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n While fuzzing is typically geared toward finding low-level implementation bugs, it can inadvertently find uncontrolled resource allocation problems. This can occur when the fuzzer generates a large number of test cases but does not restart the targeted software in between test cases. If an individual test case produces a crash, but it does not do so reliably, then an inability to limit resource allocation may be the cause.\n When the allocation is directly affected by numeric inputs, then fuzzing may produce indications of this weakness.\n ",
"Certain automated dynamic analysis techniques may be effective in producing side effects of uncontrolled resource allocation problems, especially with resources such as processes, memory, and connections. The technique may involve generating a large number of requests to the software within a short time frame. Manual analysis is likely required to interpret the results.",
"\n Specialized configuration or tuning may be required to train automated tools to recognize this weakness.\n Automated static analysis typically has limited utility in recognizing unlimited allocation problems, except for the missing release of program-independent system resources such as files, sockets, and processes, or unchecked arguments to memory. For system resources, automated static analysis may be able to detect circumstances in which resources are not released after they have expired, or if too much of a resource is requested at once, as can occur with memory. Automated analysis of configuration files may be able to detect settings that do not specify a maximum value.\n Automated static analysis tools will not be appropriate for detecting exhaustion of custom resources, such as an intended security policy in which a bulletin board user is only allowed to make a limited number of posts per day.\n ",
"Manual static analysis can be useful for finding this weakness, but it might not achieve desired code coverage within limited time constraints. If denial-of-service is not considered a significant risk, or if there is strong emphasis on consequences such as code execution, then manual analysis may not focus on this weakness at all.",
"Since this weakness does not typically appear frequently within a single software package, manual white box techniques may be able to provide sufficient code coverage and reduction of false positives if all potentially-vulnerable operations can be assessed within limited time constraints.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"This weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"\n This weakness can often be detected using automated static analysis tools. Many modern tools use data flow analysis or constraint-based techniques to minimize the number of false positives.\n Automated static analysis might not be able to recognize when proper input validation is being performed, leading to false positives - i.e., warnings that do not have any security consequences or require any code changes.\n Automated static analysis might not be able to detect the usage of custom API functions or third-party libraries that indirectly invoke OS commands, leading to false negatives - especially if the API/library code is not available for analysis.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"This weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.",
"\n This weakness can often be detected using automated static analysis tools. Many modern tools use data flow analysis or constraint-based techniques to minimize the number of false positives.\n Automated static analysis generally does not account for environmental considerations when reporting out-of-bounds memory operations. This can make it difficult for users to determine which warnings should be investigated first. For example, an analysis tool might report buffer overflows that originate from command line arguments in a program that is not expected to run with setuid or other special privileges.\n ",
"Use the XSS Cheat Sheet [REF-714] or automated test-generation tools to help launch a wide variety of attacks against your web application. The Cheat Sheet contains many subtle XSS variations that are specifically targeted against weak XSS defenses.",
"Use automated static analysis tools that target this type of weakness. Many modern techniques use data flow analysis to minimize the number of false positives. This is not a perfect solution, since 100% accuracy and coverage are not feasible, especially when multiple components are involved.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Configuration Checker\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"This weakness may be detectable using manual code analysis. Unless authentication is decentralized and applied throughout the software, there can be sufficient time for the analyst to find incoming authentication routines and examine the program logic looking for usage of hard-coded credentials. Configuration files could also be analyzed.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Network Sniffer\n Forced Path Execution\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n \n ",
"Credential storage in configuration files is findable using black box methods, but the use of hard-coded credentials for an incoming authentication routine typically involves an account that is not visible outside of the code.",
"Automated white box techniques have been published for detecting hard-coded credentials for incoming authentication, but there is some expert disagreement regarding their effectiveness and applicability to a broad range of methods.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n For hard-coded credentials in incoming authentication: use monitoring tools that examine the software's process as it interacts with the operating system and the network. This technique is useful in cases when source code is unavailable, if the software was not developed by you, or if you want to verify that the build phase did not introduce any new weaknesses. Examples include debuggers that directly attach to the running process; system-call tracing utilities such as truss (Solaris) and strace (Linux); system activity monitors such as FileMon, RegMon, Process Monitor, and other Sysinternals utilities (Windows); and sniffers and protocol analyzers that monitor network traffic.\n Attach the monitor to the process and perform a login. Using call trees or similar artifacts from the output, examine the associated behaviors and see if any of them appear to be comparing the input to a fixed string or value.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"Manual analysis can be useful for finding this weakness, but it might not achieve desired code coverage within limited time constraints. This becomes difficult for weaknesses that must be considered for all inputs, since the attack surface can be too large.",
"\n This weakness can often be detected using automated static analysis tools. Many modern tools use data flow analysis or constraint-based techniques to minimize the number of false positives.\n Automated static analysis generally does not account for environmental considerations when reporting out-of-bounds memory operations. This can make it difficult for users to determine which warnings should be investigated first. For example, an analysis tool might report buffer overflows that originate from command line arguments in a program that is not expected to run with setuid or other special privileges.\n ",
"This weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"Since this weakness does not typically appear frequently within a single software package, manual white box techniques may be able to provide sufficient code coverage and reduction of false positives if all potentially-vulnerable operations can be assessed within limited time constraints.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Attack Modeling\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n Monitored Virtual Environment - run potentially malicious code in sandbox / wrapper / virtual machine, see if it does anything suspicious\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Forced Path Execution\n Monitored Virtual Environment - run potentially malicious code in sandbox / wrapper / virtual machine, see if it does anything suspicious\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Attack Modeling\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n Forced Path Execution\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n Automated static analysis is useful for detecting commonly-used idioms for authorization. A tool may be able to analyze related configuration files, such as .htaccess in Apache web servers, or detect the usage of commonly-used authorization libraries.\n Generally, automated static analysis tools have difficulty detecting custom authorization schemes. In addition, the software's design may include some functionality that is accessible to any user and does not require an authorization check; an automated technique that detects the absence of authorization may report false positives.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n Formal Methods / Correct-By-Construction\n \n \n \n ",
"\n This weakness can be detected using tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session.\n Specifically, manual static analysis is useful for evaluating the correctness of custom authorization mechanisms.\n ",
"Automated dynamic analysis may find many or all possible interfaces that do not require authorization, but manual analysis is required to determine if the lack of authorization violates business logic.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Host Application Interface Scanner\n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"\n This weakness can be detected using tools and techniques that require manual (human) analysis, such as penetration testing, threat modeling, and interactive tools that allow the tester to record and modify an active session.\n Specifically, manual static analysis is useful for evaluating the correctness of custom authorization mechanisms.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n Database Scanners\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Host Application Interface Scanner\n Fuzz Tester\n Framework-based Fuzzer\n Forced Path Execution\n Monitored Virtual Environment - run potentially malicious code in sandbox / wrapper / virtual machine, see if it does anything suspicious\n \n \n \n ",
"\n Automated static analysis is useful for detecting commonly-used idioms for authorization. A tool may be able to analyze related configuration files, such as .htaccess in Apache web servers, or detect the usage of commonly-used authorization libraries.\n Generally, automated static analysis tools have difficulty detecting custom authorization schemes. Even if they can be customized to recognize these schemes, they might not be able to tell whether the scheme correctly performs the authorization in a way that cannot be bypassed or subverted by an attacker.\n ",
"Automated dynamic analysis may not be able to find interfaces that are protected by authorization checks, even if those checks contain weaknesses.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Fuzz Tester\n Framework-based Fuzzer\n \n \n \n ",
"Manual analysis can be useful for finding this weakness, but it might not achieve desired code coverage within limited time constraints. This becomes difficult for weaknesses that must be considered for all inputs, since the attack surface can be too large.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Manual Source Code Review (not inspections)\n \n \n Cost effective for partial coverage:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n \n \n \n ",
"\n This weakness can often be detected using automated static analysis tools. Many modern tools use data flow analysis or constraint-based techniques to minimize the number of false positives.\n Automated static analysis might not be able to recognize when proper input validation is being performed, leading to false positives - i.e., warnings that do not have any security consequences or do not require any code changes.\n Automated static analysis might not be able to detect the usage of custom API functions or third-party libraries that indirectly invoke SQL commands, leading to false negatives - especially if the API/library code is not available for analysis.\n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Database Scanners\n \n \n Cost effective for partial coverage:\n \n \n Web Application Scanner\n Web Services Scanner\n \n \n \n ",
"This weakness can be detected using dynamic tools and techniques that interact with the software using large test suites with many diverse inputs, such as fuzz testing (fuzzing), robustness testing, and fault injection. The software's operation may slow down, but it should not become unstable, crash, or generate incorrect results.",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Bytecode Weakness Analysis - including disassembler + source code weakness analysis\n Binary Weakness Analysis - including disassembler + source code weakness analysis\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Binary / Bytecode disassembler - then use manual analysis for vulnerabilities & anomalies\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Cost effective for partial coverage:\n \n \n Configuration Checker\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Focused Manual Spotcheck - Focused manual analysis of source\n Manual Source Code Review (not inspections)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Formal Methods / Correct-By-Construction\n \n \n Cost effective for partial coverage:\n \n \n Inspection (IEEE 1028 standard) (can apply to requirements, design, source code, etc.)\n \n \n \n ",
"\n According to SOAR, the following detection techniques may be useful:\n \n Highly cost effective:\n \n \n Source code Weakness Analyzer\n Context-configured Source Code Weakness Analyzer\n \n \n \n ",
"\n The external control or influence of filenames can often be detected using automated static analysis that models data flow within the software.\n Automated static analysis might not be able to recognize when proper input validation is being performed, leading to false positives - i.e., warnings that do not have any security consequences or require any code changes. If the program uses a customized input validation library, then some tools may allow the analyst to create custom signatures to detect usage of those routines.\n ",
"Manual white-box analysis can be very effective for finding this issue, since there is typically a relatively small number of include or require statements in each program."
]