CS 5950/6030 Network Security
Class 13 (F, 9/30/05)
Leszek Lilien
Department of Computer Science
Western Michigan University

Based on Security in Computing, Third Edition, by Pfleeger and Pfleeger.
Using some slides courtesy of:
Prof. Aaron Striegel — at U. of Notre Dame
Prof. Barbara Endicott-Popovsky and Prof. Deborah Frincke — at U. Washington
Prof. Jussipekka Leiwo — at Vrije Universiteit (Free U.), Amsterdam, The Netherlands

Slides not created by the above authors are © by Leszek T. Lilien, 2005.
Requests to use original slides for non-profit purposes will be gladly granted upon a written request.
2
2. Cryptology
...2H. The Uses of Encryption
...2H.4. Certificates
a. Introduction
b. Trust Through a Common Respected Individual
c. Certificates to Authenticate Identity – PART 1
c. Certificates to Authenticate Identity – PART 2
d. Trust Without a Single Hierarchy
3. Program Security
3.1. Secure Programs – Defining & Testing
a. Introduction
b. Judging S/w Security by Fixing Faults
c. Judging S/w Security by Testing Pgm Behavior
d. Judging S/w Security by Pgm Security Analysis
e. Types of Pgm Flaws
3.2. Nonmalicious Program Errors
a. Buffer overflows – PART 1
Class 12
3
[c. – CONT] Certificates to Authenticate Identity (11)
Requirements for a certification scheme:
1) Any participant can read Cert to determine X and KPUB-X.
2) Any participant can verify that Cert originated from the CA (Certificate Authority) and is not counterfeit.
3) Only the CA can create and update Cert.
4) Any participant can verify the currency of Cert.
   To this end, each Cert must include a timestamp (not discussed by us).
Q: Does our scheme satisfy these requirements?
[cf. Stallings, "Cryptography and Network Security"]
4
d. Trust Without a Single Hierarchy (1)
The certificate structure relies on a TTP (trusted third party) at the top of the certificate hierarchy
  The TTP is not certified by anybody! It must be very trustworthy
...
Different people, applications, etc., can and do use different TTPs (CAs) for certification
5
3. Program Security (1)
Program security – our first step in applying security to computing
Protecting programs is at the heart of computer security
  All kinds of programs: from apps through OSs, DBMSs, and networks
Issues:
  How to keep pgms free from flaws
  How to protect computing resources from pgms with flaws
Issues of trust (not considered here):
  How trustworthy is a pgm you buy?
  How to use it in its most secure way?
  Partial answers:
    Third-party evaluations
    Liability and s/w warranties
6
3.1. Secure Programs – Defining & Testing
Outline
a. Introduction
b. Judging S/w Security by Fixing Faults
c. Judging S/w Security by Testing Pgm Behavior
d. Judging S/w Security by Pgm Security Analysis
e. Types of Pgm Flaws
[cf. B. Endicott-Popovsky]
7
e. Types of Pgm Flaws
Taxonomy of pgm flaws:
  Intentional
    Malicious
    Nonmalicious
  Inadvertent
    Validation error (incomplete or inconsistent)
      e.g., incomplete or inconsistent input data
    Domain error
      e.g., using a variable value outside of its domain
    Serialization and aliasing
      serialization – e.g., in DBMSs or OSs
      aliasing – one variable, or some reference, which, when changed, has an indirect (usually unexpected) effect on some other data
    Inadequate ID and authentication (Section 4 – on OSs)
    Boundary condition violation
    Other exploitable logic errors
[cf. B. Endicott-Popovsky]
8
3.2. Nonmalicious Program Errors
Outline
a. Buffer overflows
b. Incomplete mediation
c. Time-of-check to time-of-use errors
d. Combinations of nonmalicious program flaws
9
a. Buffer Overflows (1)
Buffer overflow flaw — often inadvertent (=> nonmalicious) but with serious security consequences
Many languages require buffer size declaration
  C language declaration: char sample[10];
  Execute statement: sample[i] = 'A'; where i = 10
  Out-of-bounds subscript (valid range: 0-9) – buffer overflow occurs
Some compilers don't check for exceeding bounds
  C does not perform array bounds checking
Similar problem caused by pointers
  No reasonable way to define limits for pointers
[cf. B. Endicott-Popovsky]
10
Buffer Overflows (2)
Where does 'A' go? Depends on what is adjacent to sample[10]:
  Affects user's data – overwrites user's data
  Affects user's code – changes user's instruction
  Affects OS data – overwrites OS data
  Affects OS code – changes OS instruction
This is a case of aliasing (cf. Slide 7)
[cf. B. Endicott-Popovsky]
11
Buffer Overflows (3a)
Implications of buffer overflow:
  Attacker can insert malicious data values / instruction codes into the "overflow space"
[cf. B. Endicott-Popovsky]
12
End of Class 12
13
2. Cryptology
...2H. The Uses of Encryption
...2H.4. Certificates
...
c. Certificates to Authenticate Identity – PART 2
d. Trust Without a Single Hierarchy
3. Program Security
3.1. Secure Programs – Defining & Testing
a. Introduction
b. Judging S/w Security by Fixing Faults
c. Judging S/w Security by Testing Pgm Behavior
d. Judging S/w Security by Pgm Security Analysis
e. Types of Pgm Flaws
3.2. Nonmalicious Program Errors
a. Buffer overflows – PART 1
a. Buffer overflows – PART 2
b. Incomplete mediation
c. Time-of-check to time-of-use errors
d. Combinations of nonmalicious program flaws
Class 12
Class 13
14
3.3. Malicious Code
3.3.1. General-Purpose Malicious Code (incl. Viruses)
a. Introduction
b. Kinds of Malicious Code
c. How Viruses Work – PART 1
15
Buffer Overflows (3b)
Implications of buffer overflow:
  Attacker can insert malicious data values / instruction codes into the "overflow space"
  Suppose buffer overflow affects the OS code area
    Attacker code is executed as if it were OS code
      Attacker might need to experiment to see what happens when he inserts A into the OS code area
    Can raise attacker's privileges (to OS privilege level)
      When A is an appropriate instruction, attacker can gain full control of the OS
[cf. B. Endicott-Popovsky]
16
Buffer Overflows (4)
Suppose buffer overflow affects a call stack area
A scenario:
  Stack: [data][data][...]
  Pgm executes a subroutine
    => return address pushed onto stack (so subroutine knows where to return control to when finished)
    Stack: [ret_addr][data][data][...]
  Subroutine allocates dynamic buffer char sample[10]
    => buffer (10 empty spaces) pushed onto stack
    Stack: [..........][ret_addr][data][data][...]
  Subroutine executes: sample[i] = 'A' for i = 10
    Stack: [..........][A][data][data][...]
  Note: ret_addr overwritten by A! (Assumed: size of ret_addr is 1 char)
17
Buffer Overflows (5)
Suppose buffer overflow affects a call stack area — CONT
  Stack: [..........][A][data][data][...]
  Subroutine finishes
    Buffer for char sample[10] is deallocated
    Stack: [A][data][data][...]
  RET operation pops A from stack (considers it the return address)
    Stack: [data][data][...]
  Pgm (which called the subroutine) jumps to A
    => shifts program control to where the attacker wanted
Note: By playing with his own pgm, the attacker can specify any "return address" for his subroutine
  Upon subroutine return, pgm transfers control to the attacker's chosen address A (even in the OS area)
  The next instruction executed is the one at address A
    Could be the 1st instruction of a pgm that grants the highest access privileges to its "executor"
18
Buffer Overflows (6)
Note: [Wikipedia – aliasing]
  The C programming language specifications do not specify how data is to be laid out in memory (incl. stack layout)
  Some implementations of C may leave space between arrays and variables on the stack, for instance, to minimize possible aliasing effects
19
Buffer Overflows (7)
Web server attack similar to buffer overflow attack:
  pass a very long string to the web server (details: textbook, p. 103)
Buffer overflows are still common
  Used by attackers
    to crash systems
    to exploit systems by taking over control
  A large # of vulnerabilities are due to buffer overflows
20
b. Incomplete Mediation (1)
Incomplete mediation flaw — often inadvertent (=> nonmalicious) but with serious security consequences
Incomplete mediation:
  Sensitive data are in an exposed, uncontrolled condition
Example
  URL to be generated by client's browser to access server, e.g.:
    http://www.things.com/order/final&custID=101&part=555A&qy=20&price=10&ship=boat&shipcost=5&total=205
  Instead, user edits the URL directly, changing price and total cost as follows:
    http://www.things.com/order/final&custID=101&part=555A&qy=20&price=1&ship=boat&shipcost=5&total=25
  User uses the forged URL to access the server
  The server takes 25 as the total cost
21
Incomplete Mediation (2)
Unchecked data are a serious vulnerability!
Possible solution: anticipate problems
  Don't let the client return a sensitive result (like total) that can be easily recomputed by the server
  Use drop-down boxes / choice lists for data input
    Prevent user from editing input directly
  Check validity of data values received from client
22
c. Time-of-check to Time-of-use Errors (1)
Time-of-check to time-of-use flaw — often inadvertent (=> nonmalicious) but with serious security consequences
  A.k.a. synchronization flaw / serialization flaw
TOCTTOU — mediation with "bait and switch" in the middle
Non-computing example:
  Swindler shows buyer a real Rolex watch (bait)
  After buyer pays, switches the real Rolex for a forged one
In computing:
  Change of a resource (e.g., data) between the time access is checked and the time access is used
Q: Any examples of TOCTTOU problems from computing?
23
Time-of-check to Time-of-use Errors (2)
... TOCTTOU — mediation with "bait and switch" in the middle ...
Q: Any examples of TOCTTOU problems from computing?
A: E.g., DBMS/OS: serialization problem:
  pgm1 reads value of X = 10
  pgm1 adds: X = X + 5
  pgm2 reads X = 10, adds 3 to X, writes X = 13
  pgm1 writes X = 15
  X ends up with value 15 – should be X = 18
24
Time-of-check to Time-of-use Errors (3)
Prevention of TOCTTOU errors
  Be aware of time lags
  Use digital signatures and certificates to "lock" data values after checking them
    So nobody can modify them after check & before use
Q: Any examples of preventing TOCTTOU from DBMS/OS areas?
25
Time-of-check to Time-of-use Errors (4)
Prevention of TOCTTOU errors ...
Q: Any examples of preventing TOCTTOU from DBMS/OS areas?
A1: E.g., DBMS: locking to enforce proper serialization
  (locks need not use signatures — fully controlled by DBMS)
  In the previous example, locking:
    will force writing X = 15 by pgm1 before pgm2 reads X (so pgm2 adds 3 to 15), OR
    will force writing X = 13 by pgm2 before pgm1 reads X (so pgm1 adds 5 to 13)
A2: E.g., DBMS/OS: any other concurrency control mechanism enforcing serializability
26
d. Combinations of Nonmal. Pgm Flaws
The above flaws can be exploited in multiple steps by a concerted attack
Nonmalicious flaws can be exploited to plant malicious flaws (next)
27
3.3. Malicious Code
Malicious code, or a rogue pgm, is written to exploit flaws in pgms
Malicious code can do anything a pgm can
  Malicious code can change
    data
    other programs
Malicious code was "officially" defined by Cohen in 1984, but virus behavior has been known since at least 1970
  Ware's study for the Defense Science Board (classified, made public in 1979)
Outline for this Subsection:
3.3.1. General-Purpose Malicious Code (incl. Viruses)
3.3.2. Targeted Malicious Code
28
3.3.1. General-Purpose Malicious Code (incl. Viruses)
Outline
a. Introduction
b. Kinds of Malicious Code
c. How Viruses Work
d. Virus Signatures
e. Preventing Virus Infections
f. Seven Truths About Viruses
g. Case Studies
[cf. B. Endicott-Popovsky]
29
a. Introduction
Viruses are a prominent example of general-purpose malicious code
  Not "targeted" against any particular user
  Attack anybody with a given app/system/config/...
Viruses
  Many kinds and varieties
  Benign or harmful
  Transferred even from trusted sources
    Also from "trusted" sources that are negligent about updating antiviral programs and checking for viruses
[cf. B. Endicott-Popovsky]
30
b. Kinds of Malicious Code (1) Trojan horse - A computer program that appears to
have a useful function, but also has a hidden and potentially malicious function that evades security mechanisms, sometimes by exploiting legitimate authorizations of a system entity that invokes the program
Virus - A hidden, self-replicating section of computer software, usually malicious logic, that propagates by infecting (i.e., inserting a copy of itself into and becoming part of) another program. A virus cannot run by itself; it requires that its host program be run to make the virus active.
Worm - A computer program that can run independently, can propagate a complete working version of itself onto other hosts on a network, and may consume computer resources destructively.
31
Kinds of Malicious Code (2)
Bacterium - A specialized form of virus which does not attach to a specific file. Usage obscure.
Logic bomb - Malicious [program] logic that activates when specified conditions are met. Usually intended to cause denial of service or otherwise damage system resources.
Time bomb - Activates when a specified time occurs
Rabbit – A virus or worm that replicates itself without limit to exhaust resources
Trapdoor / backdoor - A hidden computer flaw known to an intruder, or a hidden computer mechanism (usually software) installed by an intruder, who can activate the trap door to gain access to the computer without being blocked by security services or mechanisms.
32
Kinds of Malicious Code (3)
The above terms are not always used consistently, esp. in the popular press
Combinations of the above kinds are even more confusing
  E.g., a virus can be a time bomb — spreads like a virus, "explodes" when the specified time occurs
The term "virus" is often used to refer to any kind of malicious code
  When discussing malicious code, we'll often say "virus" for any malicious code
33
c. How Viruses Work (1)
A pgm containing a virus must be executed to spread the virus or infect other pgms
  Even one pgm execution suffices to spread the virus widely
Virus actions: spread / infect
Spreading – Example 1: Virus in a pgm on an installation CD
  User activates the pgm containing the virus when she runs INSTALL or SETUP
  Virus installs itself in any/all executing pgms present in memory
  Virus installs itself in pgms on the hard disk
  From now on the virus spreads whenever any of the infected pgms (from memory or hard disk) executes
34
How Viruses Work (2)
Spreading – Example 2: Virus in an attachment to an e-mail msg
  User activates the pgm containing the virus (e.g., a macro in an MS Word document) by just opening the attachment
    => Disable automatic opening of attachments!!!
  Virus installs itself and spreads ... as in Example 1 ...
Spreading – Example 3: Virus in a downloaded file
  File with a pgm or document (.doc, .xls, .ppt, etc.)
  You know the rest by now...
Document virus
  Spreads via a picture, document, spreadsheet, slide presentation, database, ...
    E.g., via .jpg, or via MS Office documents: .doc, .xls, .ppt, .mdb
  Currently most common!
35
How Viruses Work (3)
Kinds of viruses w.r.t. the way of attaching to infected pgms:
1) Appended viruses
  Appends to a pgm
    Most often virus code precedes pgm code
      Inserts its code before the 1st pgm instruction in the executable pgm file
  Executes whenever the program is executed
2) Surrounding viruses
  Surrounds the program
  Executes before and after the infected program
    Intercepts its input/output
    Erases its tracks
  The "after" part might be used to mask the virus's existence
    E.g., if it surrounds "ls", the "after" part removes the listing of the virus file produced by "ls" so the user can't see it
... cont. ...
36
How Viruses Work (4)
... cont. ...
3) Integrating viruses
  Integrates into pgm code
  Spreads within infected pgms
4) Replacing viruses
  Entirely replaces the code of the infected pgm file
37
How Viruses Work (5)
A (replacing) virus V gains control over target pgm T by:
  Overwriting T on the hard disk, OR
  Replacing the pointer to T with a pointer to V (textbook, Fig. 3-7)
    The OS has a File Directory
    The File Directory has an entry that points to the file with code for T
    The virus replaces the pointer to T's file with a pointer to V's file
In both cases, actions of V replace actions of T when the user executes what she thinks is "T"
38
End of Class 13