Forensics
2011.03.19 12:58

Definition of Digital Forensics

Digital forensics

Digital forensics (sometimes Digital forensic science) is a branch of forensic science encompassing the recovery and investigation of material found in digital devices, often in relation to computer crime.[1][2] The term digital forensics was originally used as a synonym for computer forensics but has expanded to cover all devices capable of storing digital data and is now used to describe the entire field.[1] The discipline evolved in a haphazard manner during the 1990s and it was not until the early 2000s that national policies were created.

Investigations can fall into four categories. The most common category is forensic analysis, where evidence is recovered to support or oppose a hypothesis before a criminal court. This is closely related to intelligence gathering, where material is intended to identify other suspects or crimes. eDiscovery is a form of discovery related to civil litigation, and intrusion investigation is a specialist investigation into the nature and extent of an unauthorized network intrusion. The technical side of investigations is divided into several sub-branches: computer forensics, network forensics, database forensics and mobile device forensics. Any number of these fields may be utilised in an investigation.

As well as identifying direct evidence of a crime, digital forensics can be used to attribute evidence to specific suspects, confirm alibis or statements, determine intent, identify sources (for example, in copyright cases) or authenticate documents.[3] Investigations are much broader in scope than other areas of forensic analysis (where the usual aim is to provide answers to a series of simpler questions), often involving complex time-lines or hypotheses.[4]

The digital forensic process encompasses the seizure, forensic imaging (acquisition) and analysis of digital media, and finally the production of a report of the digital evidence for the courts or an employer. Computer devices tend to store large amounts of information in cache/log files and deleted space, and forensic examiners can recover this data as part of the analysis process.


History

Aerial photo of FLETC, where US digital forensics standards were developed in the 80s and 90s

Prior to the 1980s crimes involving computers were dealt with using existing laws. The first computer crimes were recognized in the 1978 Florida Computer Crimes Act, which included legislation against the unauthorized modification or deletion of data on a computer system.[5][6] Over the next few years the range of computer crimes being committed increased, and laws were passed to deal with issues of copyright, privacy/harassment (e.g., cyber bullying, cyber stalking, and online predators) and child pornography.[7][8] It was not until the 1980s that federal laws began to incorporate computer offences. Canada was the first country to pass such legislation, in 1983.[6] This was followed by the US Federal Computer Fraud and Abuse Act in 1986, Australian amendments to their crimes acts in 1989 and the British Computer Misuse Act in 1990.[6][8]

1980s-1990s: Growth of the field

The growth in computer crime during the 1980s and 1990s caused law enforcement agencies to begin establishing specialized groups, usually at the national level, to handle the technical aspects of investigations. For example, in 1984 the FBI launched a Computer Analysis and Response Team, and the following year a computer crime department was set up within the British Metropolitan Police fraud squad. Besides being law enforcement professionals, many of the early members of these groups were also computer hobbyists and became responsible for the field's initial research and direction.[9][10]

One of the first practical (or at least publicised) examples of digital forensics was Cliff Stoll's pursuit of hacker Markus Hess in 1986. Stoll, whose investigation made use of both computer and network forensic techniques, was not a specialised examiner.[11] Many of the earliest forensic examinations followed the same profile.[12]

Throughout the 1990s there was high demand for these new, and basic, investigative resources. The strain on central units led to the creation of regional, and even local, groups to help handle the load. For example, the British National Hi-Tech Crime Unit was set up in 2001 to provide a national infrastructure for computer crime, with personnel located both centrally in London and with the various regional police forces (the unit was folded into the Serious Organised Crime Agency (SOCA) in 2006).[10]

During this period the science of digital forensics grew out of ad-hoc tools and techniques developed by these hobbyist practitioners. This is in contrast to other forensics disciplines which developed from work by the scientific community.[1][13] It was not until 1992 that the term "computer forensics" was used in academic literature (although prior to this it had been in informal use); a paper by Collier and Spaul attempted to justify this new discipline to the forensic science world.[14][15] This swift development resulted in a lack of standardization and training. In his 1995 book, "High-Technology Crime: Investigating Cases Involving Computers", K Rosenblatt wrote:[6]

Seizing, preserving, and analyzing evidence stored on a computer is the greatest forensic challenge facing law enforcement in the 1990s. Although most forensic tests, such as fingerprinting and DNA testing, are performed by specially trained experts the task of collecting and analyzing computer evidence is often assigned to patrol officers and detectives[16]

2000s: Developing standards

Since 2000, in response to the need for standardization, various bodies and agencies have published guidelines for digital forensics. The Scientific Working Group on Digital Evidence (SWGDE) produced a 2002 paper, "Best practices for Computer Forensics"; this was followed, in 2005, by the publication of an ISO standard (ISO 17025, General requirements for the competence of testing and calibration laboratories).[17][6][18] A European-led international treaty, the Convention on Cybercrime, came into force in 2004 with the aim of reconciling national computer crime laws, investigative techniques and international co-operation. The treaty has been signed by 43 nations (including the US, Canada, Japan, South Africa, the UK and other European nations) and ratified by 16.

The issue of training also received attention. Commercial companies (often forensic software developers) began to offer certification programs and digital forensic analysis was included as a topic at the UK specialist investigator training facility, Centrex.[6][10]

Since the late 1990s mobile devices have become more widely available, advancing beyond simple communication devices, and have been found to be rich sources of information, even for crimes not traditionally associated with digital forensics.[19] Despite this, digital analysis of phones has lagged behind that of traditional computer media, largely due to problems with the proprietary nature of devices.[20]

Focus has also shifted onto internet crime, particularly the risk of cyber warfare and cyberterrorism. A February 2010 report by the U.S. Joint Forces Command concluded:

Through cyberspace, enemies will target industry, academia, government, as well as the military in the air, land, maritime, and space domains. In much the same way that airpower transformed the battlefield of World War II, cyberspace has fractured the physical barriers that shield a nation from attacks on its commerce and communication.[21]

The field of digital forensics still faces outstanding issues. A 2009 paper, "Digital Forensic Research: The Good, the Bad and the Unaddressed", by Peterson and Shenoi identified a bias towards Windows operating systems in digital forensics research.[22] In 2010 Simson Garfinkel identified issues facing digital investigations in the future, including the increasing size of digital media, the wide availability of encryption to consumers, a growing variety of operating systems and file formats, an increasing number of individuals owning multiple devices, and legal limitations on investigators. The paper also identified continued training issues, as well as the prohibitively high cost of entering the field.[11]

Investigative tools

During the 1980s very few specialised digital forensic tools existed and investigators often performed live analysis on media, examining computers from within the operating system using existing sysadmin tools to extract evidence. This risked modifying data on the disk (inadvertently or otherwise) leading to claims of evidence tampering. In the early 1990s a number of tools were created to allow investigations to take place without the risk of altering data.

The need for such software was first recognised in 1989 at the Federal Law Enforcement Training Center, resulting in the creation of IMDUMP (by Michael White) and, in 1990, SafeBack (developed by Sydex). Similar pieces of software were produced in other countries; DIBS (a hardware and software solution) was released commercially in the UK in 1991, and Rob McKemmish released Fixed Disk Image free to Australian law enforcement.[9] These tools allowed examiners to create an exact copy of a piece of digital media to work on, leaving the original disk intact for verification. By the end of the 1990s, as demand for digital evidence grew, more advanced commercial tools (EnCase, FTK, etc.) were developed that allowed analysts to examine copies of media without using any live forensics.[6]

More recently the same progression of tool development has occurred for mobile devices; initially investigators accessed data directly on the device, but this was soon replaced with specialist tools (such as XRY or Radio Tactics Aceso).[6]

Forensic Process

A portable Tableau write-blocker attached to a Hard Drive

A digital forensic investigation commonly consists of three stages: acquisition or imaging of exhibits, analysis, and reporting.[6][23] Acquisition involves creating an exact sector-level duplicate (or "forensic duplicate") of the media, often using a write blocking device to prevent modification of the original. Both the acquired image and the original media are hashed (using SHA-1 or MD5) and the values compared to verify that the copy is accurate.[24]
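
The hash comparison in the acquisition step can be illustrated with a short Python sketch. It is illustrative only: it assumes the original media and the acquired image are both readable as ordinary files, and the device path and image name below are placeholders.

    import hashlib

    def hash_stream(path, algorithm="sha1", chunk_size=1024 * 1024):
        """Hash a device node or image file in fixed-size chunks."""
        digest = hashlib.new(algorithm)
        with open(path, "rb") as stream:
            for chunk in iter(lambda: stream.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Placeholder paths: the original media (behind a write blocker) and its forensic duplicate.
    source = "/dev/sdb"
    image = "evidence01.dd"

    if hash_stream(source) == hash_stream(image):
        print("Acquisition verified: the image matches the original media")
    else:
        print("Hash mismatch: the image is not an exact duplicate")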

During the analysis phase investigators recover evidence material using a number of different methodologies and tools. In 2002 an article in the International Journal of Digital Evidence referred to this step as "an in-depth systematic search of evidence related to the suspected crime".[1] In 2006, forensics researcher Brian Carrier described an "intuitive procedure" in which obvious evidence is first identified and then "exhaustive searches are conducted to start filling in the holes."[4]

The actual process of analysis can vary between investigations, but common methodologies include conducting keyword searches across the digital media (within files as well as unallocated and slack space), recovering deleted files, and extracting registry information (for example, to list user accounts or attached USB devices).
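
A keyword search of this kind can be sketched in a few lines of Python. The example below is a simplified illustration rather than a production tool: it scans a raw image byte-for-byte (so unallocated and slack space are covered), the keyword list and image file name are hypothetical, and a real examiner would also handle text encodings and file-system structure.

    KEYWORDS = [b"invoice", b"password"]          # hypothetical search terms
    CHUNK = 4 * 1024 * 1024
    OVERLAP = max(len(k) for k in KEYWORDS) - 1   # so hits straddling chunk edges are not missed

    def keyword_scan(image_path):
        """Yield (byte_offset, keyword) for every match in a raw disk image."""
        with open(image_path, "rb") as img:
            offset = 0          # absolute offset of the start of the current block
            tail = b""          # last OVERLAP bytes of the previous block
            while True:
                block = img.read(CHUNK)
                if not block:
                    break
                data = tail + block
                for kw in KEYWORDS:
                    start = 0
                    while (pos := data.find(kw, start)) != -1:
                        if pos + len(kw) > len(tail):   # skip hits already reported last pass
                            yield offset - len(tail) + pos, kw
                        start = pos + 1
                tail = data[-OVERLAP:] if OVERLAP else b""
                offset += len(block)

    for hit, kw in keyword_scan("evidence01.dd"):        # hypothetical image name
        print(f"{kw.decode()} at byte offset {hit}")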

Once evidence is recovered it is analysed to reconstruct events or actions and to reach conclusions, work that can often be performed by less specialised staff.[1] When an investigation is complete the data is presented, usually in the form of a written report, in layperson's terms.[1]

Forms and uses

An example of an image's Exif metadata that might be used to prove its origin

Investigations can take one of four forms. The first is a forensic examination, traditionally associated with criminal law, where evidence is collected to support or oppose a hypothesis. Like other areas of forensics this is often part of a wider investigation spanning a number of disciplines. A related form is "intelligence gathering", which is functionally identical to a forensic examination except that the digital evidence is intended to be used as intelligence to (for example) locate, identify or halt other crimes. As a result, intelligence gathering is sometimes held to a less strict forensic standard.

In civil litigation or corporate matters the process is referred to as electronic discovery (or eDiscovery). The forensic procedure is similar to that used in criminal investigations, but with different legal requirements and limitations. Finally, intrusion investigation is a specialist examination into the nature and extent of an unauthorized network intrusion. Intrusion analysis is usually performed as a damage limitation exercise after an attack, both to establish the extent of any intrusion and to try to identify the attacker.[4][3] Such attacks were commonly conducted over phone lines during the 1980s, but in the modern era they are usually propagated over the internet.[25]

The main use of digital forensics is to recover objective evidence of a criminal activity (termed actus reus in legal parlance). However, the diverse range of data held in digital devices can help with other areas of investigation.[3]

Attribution
Metadata and other logs can be used to attribute actions to an individual. For example, personal documents on a computer drive might identify its owner.
Alibis and statements
Information provided by those involved can be cross-checked against digital evidence. For example, during the investigation into the Soham murders the offender's alibi was disproved when mobile phone records of the person he claimed to be with showed she was out of town at the time.
Intent
As well as finding objective evidence of a crime being committed, investigations can also be used to prove intent (known by the legal term mens rea). For example, the Internet history of convicted killer Neil Entwistle included references to a site discussing how to kill people.
Evaluation of source
File artifacts and metadata can be used to identify the origin of a particular piece of data; for example, older versions of Microsoft Word embedded a Globally Unique Identifier (GUID) into files which identified the computer on which they had been created. Proving whether a file was produced on the digital device being examined or obtained from elsewhere (e.g., the Internet) can be very important.[3]
Document authentication
Related to "Evaluation of source", metadata associated with digital documents can easily be modified (for example, by changing the computer clock you can affect the apparent creation date of a file). Document authentication relates to detecting and identifying falsification of such details (a minimal timestamp-inspection sketch follows this list).
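
As a small illustration of the kind of metadata involved in attribution and document authentication, the Python sketch below reads a file's filesystem timestamps. The file name is a placeholder, and note that st_ctime means creation time on Windows but last metadata change on Unix-like systems; either value can be skewed by a manipulated clock.

    import os
    from datetime import datetime, timezone

    def mac_times(path):
        """Return a file's modified/accessed/changed-or-created timestamps as UTC strings."""
        st = os.stat(path)

        def as_utc(t):
            return datetime.fromtimestamp(t, tz=timezone.utc).isoformat()

        return {
            "modified": as_utc(st.st_mtime),
            "accessed": as_utc(st.st_atime),
            # Creation time on Windows, last metadata change on Unix-like systems.
            "changed_or_created": as_utc(st.st_ctime),
        }

    print(mac_times("report.docx"))   # hypothetical file under examination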

Limitations

One major limitation to a forensic investigation is the use of encryption; this disrupts initial examination where pertinent evidence might be located using keywords. Laws to compel individuals to disclose encryption keys are still relatively new and controversial.[11]

Legal considerations

The examination of digital media is covered by national and international legislation. For civil investigations, in particular, laws may restrict the abilities of analysts to undertake examinations. Restrictions against network monitoring, or reading of personal communications often exist.[26] For criminal investigation, national laws restrict how much information can be seized.[26] For example, in the United Kingdom seizure of evidence by law enforcement is governed by the PACE act.[6] The "International Organization on Computer Evidence" (IOCE) is one agency that works to establish compatible international standards for the seizure of evidence.[27]

In the UK the same laws covering computer crime can also affect forensic investigators. The Computer Misuse Act 1990 legislates against unauthorised access to computer material; this is a particular concern for civil investigators, who have more limitations than law enforcement.

An individual's right to privacy is one area of digital forensics which is still largely undecided by the courts. The US Electronic Communications Privacy Act places limitations on the ability of law enforcement or civil investigators to intercept and access evidence. The act makes a distinction between stored communication (e.g. email archives) and transmitted communication (such as VOIP). The latter, being considered more of a privacy invasion, is harder to obtain a warrant for.[6][16] The ECPA also affects the ability of companies to investigate the computers and communications of their employees; the extent to which a company can perform such monitoring is still under debate.[6]

In Europe, Article 5 of the European Convention on Human Rights asserts similar privacy limitations to the ECPA and limits the processing and sharing of personal data both within the EU and with external countries. In the UK the ability of law enforcement to conduct digital forensics investigations is legislated by the Regulation of Investigatory Powers Act.[6]

Digital evidence

Digital evidence can come in a number of forms

Where it will be used in a court of law, digital evidence falls under the same legal guidelines as other forms of evidence; courts do not usually require more stringent guidelines.[6][28] In the United States the Federal Rules of Evidence are used to evaluate the admissibility of digital evidence; the United Kingdom PACE and Civil Evidence acts have similar guidelines, and many other countries have their own laws. US federal laws restrict seizures to items with only obvious evidential value. This is acknowledged as not always being possible to establish with digital media prior to an examination.[26]

Laws dealing with digital evidence refer to two considerations: integrity and authenticity. Integrity is ensuring that the act of seizing and acquiring digital media does not modify the evidence (either the original or the copy). Authenticity refers to the ability to confirm the integrity of information; for example, that the imaged media matches the original evidence.[26] The ease with which digital media can be modified means that documenting the chain of custody from the crime scene, through analysis and, ultimately, to the court (a form of audit trail) is important for establishing the authenticity of evidence.[6]

Attorneys have argued that because digital evidence can theoretically be altered it undermines the reliability of the evidence. US judges are beginning to reject this theory; in the case US v. Bonallo the court ruled that "the fact that it is possible to alter data contained in a computer is plainly insufficient to establish untrustworthiness".[6][29] In the United Kingdom guidelines such as those issued by ACPO are followed to help document the authenticity and integrity of evidence.

Digital investigators, particularly in criminal investigations, have to ensure that conclusions are based upon factual evidence and their own expert knowledge.[6] In the US, for example, Federal Rules of Evidence state that a qualified expert may testify “in the form of an opinion or otherwise” so long as:

(1) the testimony is based upon sufficient facts or data, (2) the testimony is the product of reliable principles and methods, and (3) the witness has applied the principles and methods reliably to the facts of the case.[30]

Many of the sub-branches of digital forensics have their own specific guidelines for handling and investigating evidence. For example, mobile phones are often acquired inside a Faraday shield to stop radio traffic to the device, and in the UK forensic examination of computers in criminal matters is subject to ACPO guidelines.[6]

Branches

Digital forensics includes several sub-branches relating to the investigation of various types of devices, media or artefacts.

Computer forensics

The goal of computer forensics is to explain the current state of a digital artifact, such as a computer system, storage medium or electronic document.[31] The discipline usually covers computers, embedded systems (digital devices with rudimentary computing power and onboard memory) and static memory (such as USB pen drives).

Computer forensics can deal with a broad range of information; from logs (such as internet history) through to the actual files on the drive. In 2007 prosecutors used a spreadsheet recovered from the computer of Joseph E. Duncan III to show premeditation and secure the death penalty.[3] Sharon Lopatka's killer was identified in 2006 after email messages from him detailing torture and death fantasies were found on her computer.[6]

Mobile phones in a UK Evidence bag

Mobile device forensics

Mobile device forensics is a sub-branch of digital forensics relating to the recovery of digital evidence or data from a mobile device. It differs from computer forensics in that a mobile device will have an inbuilt communication system (e.g. GSM) and, usually, proprietary storage mechanisms. Investigations usually focus on simple data such as call data and communications (SMS/email) rather than in-depth recovery of deleted data.[6][32] SMS data from a mobile device investigation helped to exonerate Patrick Lumumba in the murder of Meredith Kercher.[3]

Mobile devices are also useful for providing location information, either from inbuilt GPS/location tracking or via cell site logs (which track the devices within their range). Such information was used to track down the kidnappers of Thomas Onofri in 2006.[3]

Network forensics

Network forensics relates to the monitoring and analysis of computer network (both local network and WAN/internet) traffic for the purposes of information gathering, legal evidence or intrusion detection.[33] Traffic is intercepted (usually at the packet level) and either stored for later analysis with specialist tools or filtered in real time for relevant information. Unlike other areas of digital forensics, network data is often volatile and seldom logged, making the discipline often reactionary.
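
As an illustration of working with stored traffic, the Python sketch below walks the records of a capture file in the classic pcap format. The file name is a placeholder and real casework would normally rely on dedicated capture and analysis tools, but the format itself is simple enough to parse directly.

    import struct

    def iter_packets(pcap_path):
        """Yield (timestamp, captured_bytes) for each record in a classic pcap file."""
        with open(pcap_path, "rb") as f:
            magic = f.read(4)
            if magic == b"\xd4\xc3\xb2\xa1":
                endian = "<"            # little-endian capture
            elif magic == b"\xa1\xb2\xc3\xd4":
                endian = ">"            # big-endian capture
            else:
                raise ValueError("not a classic pcap file")
            f.read(20)                  # remainder of the 24-byte global header
            while True:
                header = f.read(16)
                if len(header) < 16:
                    break
                ts_sec, ts_usec, incl_len, _orig_len = struct.unpack(endian + "IIII", header)
                yield ts_sec + ts_usec / 1e6, f.read(incl_len)

    # Example: total captured bytes in a (hypothetical) capture file.
    total = sum(len(data) for _ts, data in iter_packets("intrusion.pcap"))
    print("captured bytes:", total)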

In 2000, the FBI lured computer hackers Aleksey Ivanov and Gorshkov to the United States for a fake job interview. By monitoring network traffic from the pair's computers, the FBI identified passwords that let it collect evidence directly from Russia-based computers.[6][34]

Database forensics

Database forensics is a branch of digital forensics relating to the forensic study of databases and their metadata.[35] Investigations use database contents, log files and in-RAM data in order to build a time-line or recover relevant information.
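
As a minimal illustration, the Python sketch below builds a simple time-line from a forensic copy of a SQLite database opened read-only. The database path, table and column names are hypothetical and would be adapted to the schema under examination.

    import sqlite3

    # Work on a copy of the database; mode=ro keeps the examination read-only.
    conn = sqlite3.connect("file:evidence/app.db?mode=ro", uri=True)

    # Hypothetical schema: an audit_log table with timestamp, user and action columns.
    rows = conn.execute(
        "SELECT timestamp, user, action FROM audit_log ORDER BY timestamp"
    ).fetchall()

    for ts, user, action in rows:
        print(ts, user, action)       # chronological time-line of logged actions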

References

  1. ^ a b c d e f M Reith, C Carr, G Gunsch (2002). "An examination of digital forensic models". International Journal of Digital Evidence. http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.13.9683. Retrieved 2 August 2010. 
  2. ^ Carrier, B (2001). "Defining digital forensic examination and analysis tools". Digital Research Workshop II. http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.14.8953. Retrieved 2 August 2010. 
  3. ^ a b c d e f g Various (2009). Eoghan Casey. ed. Handbook of Digital Forensics and Investigation. Academic Press. p. 567. ISBN 0123742676. http://books.google.co.uk/books?id=xNjsDprqtUYC. Retrieved 27 August 2010. 
  4. ^ a b c Carrier, Brian D. "Basic Digital Forensic Investigation Concepts". http://www.digital-evidence.org/di_basics.html. 
  5. ^ "Florida Computer Crimes Act". http://www.clas.ufl.edu/docs/flcrimes/chapter2_1.html. Retrieved 31 August 2010. 
  6. ^ a b c d e f g h i j k l m n o p q r s t u Casey, Eoghan (2004). Digital Evidence and Computer Crime, Second Edition. Elsevier. ISBN 0-12-163104-4. http://books.google.co.uk/books?id=Xo8GMt_AbQsC. 
  7. ^ Aaron Phillip; David Cowen, Chris Davis (2009). Hacking Exposed: Computer Forensics. McGraw Hill Professional. p. 544. ISBN 0071626778. http://books.google.co.uk/books?id=yMdNrgSBUq0C. Retrieved 27 August 2010. 
  8. ^ a b Kabay, M. E. "A Brief History of Computer Crime". Norwich University. http://www.mekabay.com/overviews/history.pdf. Retrieved 30 August 2010. 
  9. ^ a b Mohay, George M. (2003). Computer and intrusion forensics. Artechhouse. p. 395. ISBN 1580533698. 
  10. ^ a b c Peter Sommer (January 2004). "The future for the policing of cybercrime". Computer Fraud & Security 2004 (1): 8–12. doi:10.1016/S1361-3723(04)00017-X. ISSN 1361-3723. 
  11. ^ a b c Simson L. Garfinkel (August 2010). "Digital forensics research: The next 10 years". Digital Investigation 7: S64-S73. doi:10.1016/j.diin.2010.05.009. ISSN 1742-2876. 
  12. ^ Linda Volonino, Reynaldo Anzaldua (2008). Computer forensics for dummies. For Dummies. pp. 384. ISBN 0470371919. 
  13. ^ GL Palmer, I Scientist, H View (2002). "Forensic analysis in the digital world". International Journal of Digital Evidence. https://utica.edu/academic/institutes/ecii/publications/articles/9C4E938F-E3BE-8D16-45D0BAD68CDBE77.doc. Retrieved 2 August 2010. 
  14. ^ Wilding, E. (1997). Computer Evidence: a Forensic Investigations Handbook. London: Sweet & Maxwell. p. 236. ISBN 0421579900. 
  15. ^ Collier, P.A. and Spaul, B.J. (1992). "A forensic methodology for countering computer crime". Computers and Law (Intellect Books). 
  16. ^ a b K S Rosenblatt (1995). High-Technology Crime: Investigating Cases Involving Computers. KSK Publications. ISBN 0-9648171-0-1. http://www.ncjrs.gov/App/abstractdb/AbstractDBDetails.aspx?id=175264. Retrieved 4 August 2010. 
  17. ^ "Best practices for Computer Forensics". SWGDE. Archived from the original on 3 October 2010. http://www.swgde.org/documents/archived-documents/2004-11-15%20SWGDE%20Best%20Practices%20for%20Computer%20Forensics%20v1.0.pdf. Retrieved 4 August 2010. 
  18. ^ "ISO/IEC 17025:2005". ISO. http://www.iso.org/iso/catalogue_detail.htm?csnumber=39883. Retrieved 20 August 2010. 
  19. ^ SG Punja (2008). "Mobile device analysis". Small Scale Digital Device Forensics Journal. http://www.ssddfj.org/papers/SSDDFJ_V2_1_Punja_Mislan.pdf. 
  20. ^ R Ahmed (2008). "Mobile forensics: an overview, tools, future trends and challenges from law enforcement perspective". 6th International Conference on E-Governance. http://www.iceg.net/2008/books/2/34_312-323.pdf. 
  21. ^ "The Joint Operating Environment", Report released, Feb. 18, 2010, pp. 34–36
  22. ^ Peterson, Gilbert & Shenoi, Sujeet (2009). "Digital Forensic Research: The Good, the Bad and the Unaddressed". Advances in Digital Forensics V (Springer Boston) 306: 17-36. doi:10.1007/978-3-642-04155-6_2. 
  23. ^ "'Electronic Crime Scene Investigation Guide: A Guide for First Responders". National Institute of Justice. 2001. http://www.ncjrs.gov/pdffiles1/nij/187736.pdf. 
  24. ^ Maarten Van Horenbeeck. "Technology Crime Investigation". Archived from the original on 17 May 2008. http://web.archive.org/web/20080517022757/http://www.daemon.be/maarten/forensics.html#dr. Retrieved 17 August 2010. 
  25. ^ Warren G. Kruse, Jay G. Heiser (2002). Computer forensics: incident response essentials. Addison-Wesley. p. 392. ISBN 0201707195. 
  26. ^ a b c d Sarah Mocas (February 2004). "Building theoretical underpinnings for digital forensics research". Digital Investigation 1 (1): 61–68. doi:10.1016/j.diin.2003.12.004. ISSN 1742-2876. 
  27. ^ Kanellis, Panagiotis (2006). Digital crime and forensic science in cyberspace. Idea Group Inc (IGI). p. 357. ISBN 1591408733. 
  28. ^ US v. Bonallo, 858 F. 2d 1427 (9th Cir. 1988). Text
  29. ^ "Federal Rules of Evidence #702". http://federalevidence.com/rules-of-evidence#Rule702. Retrieved 23 August 2010. 
  30. ^ A Yasinsac; RF Erbacher, DG Marks, MM Pollitt (2003). "Computer forensics education". IEEE Security & Privacy. http://ieeexplore.ieee.org/iel5/8013/27399/01219052.pdf?arnumber=1219052. Retrieved 26 July 2010. 
  31. ^ "Technology Crime Investigation :: Mobile forensics". Archived from the original on 17 May 2008. http://web.archive.org/web/20080517022757/http://www.daemon.be/maarten/forensics.html#mob. Retrieved 18 August 2010. 
  32. ^ Gary Palmer, A Road Map for Digital Forensic Research, Report from DFRWS 2001, First Digital Forensic Research Workshop, Utica, New York, August 7 – 8, 2001, Page(s) 27–30
  33. ^ "2 Russians Face Hacking Charges". Moscow Times. 24. http://www.themoscowtimes.com/news/article/2-russians-face-hacking-charges/253844.html. Retrieved 3 September 2010. 
  34. ^ Olivier, Martin S. (March 2009). "On metadata context in Database Forensics". Digital Investigation. doi:10.1016/j.diin.2008.10.001. http://www.sciencedirect.com/science/article/B7CW4-4TSD9G6-1/2/a5031117d753054d92f2afba332eadf8. Retrieved 2 August 2010. 


Filesystem
2011.03.17 13:44

Data Integrity and Recoverability with NTFS


NTFS is a recoverable file system that guarantees the consistency of the volume by using standard transaction logging and recovery techniques. In the event of a disk failure, NTFS restores consistency by running a recovery procedure that accesses information stored in a log file. The NTFS recovery procedure is exact, guaranteeing that the volume is restored to a consistent state. Transaction logging requires a very small amount of overhead.

NTFS ensures the integrity of all NTFS volumes by automatically performing HDD recovery operations the first time a program accesses an NTFS volume after the computer is restarted following a failure.

NTFS also uses a technique called cluster remapping to minimize the effects of a bad sector on an NTFS volume.

Important
If either the master boot record (MBR) or boot sector is corrupted, you might not be able to access data on the volume.

Recovering Data with NTFS

NTFS views each I/O operation that modifies a system file on the NTFS volume as a transaction, and manages each one as an integral unit. Once started, the transaction is either completed or, in the event of a disk failure, rolled back (that is, the NTFS volume is returned to the state it was in before the transaction was initiated).

To ensure that a transaction can be completed or rolled back, NTFS records the suboperations of a transaction in a log file before they are written to the disk. When a complete transaction is recorded in the log file, NTFS performs the suboperations of the transaction on the volume cache. After NTFS updates the cache, it commits the transaction by recording in the log file that the entire transaction is complete.

Once a transaction is committed, NTFS ensures that the entire transaction appears on the volume, even if the disk fails. During recovery operations, NTFS redoes each committed transaction found in the log file. Then NTFS locates the transactions in the log file that were not committed at the time of the system failure and undoes each transaction suboperation recorded in the log file. Incomplete modifications to the volume are prohibited.

NTFS uses the Log File service to log all redo and undo information for a transaction. NTFS uses the redo information to repeat the transaction. The undo information enables NTFS to undo transactions that are not complete or that have an error.
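
The redo/undo mechanism can be illustrated with a toy write-ahead log. The Python sketch below is purely conceptual and does not reflect the on-disk format of the NTFS log file; it only shows why recording suboperations before applying them allows committed transactions to be redone and incomplete ones to be rolled back.

    # Toy write-ahead log illustrating redo/undo recovery; not the NTFS log file format.
    log = []           # each entry: (txn_id, key, old_value, new_value)
    committed = set()  # transaction ids whose commit record reached the log

    def write(txn_id, volume, key, new_value):
        log.append((txn_id, key, volume.get(key), new_value))   # record before modifying
        volume[key] = new_value

    def commit(txn_id):
        committed.add(txn_id)       # the commit record marks the whole transaction durable

    def recover(volume):
        """Redo committed transactions, then undo incomplete ones in reverse order."""
        for txn_id, key, _old, new in log:
            if txn_id in committed:
                volume[key] = new                    # redo
        for txn_id, key, old, _new in reversed(log):
            if txn_id not in committed:
                if old is None:
                    volume.pop(key, None)            # undo a create
                else:
                    volume[key] = old                # undo an update
        return volume

    volume = {}
    write(1, volume, "mft_record_5", "updated")
    commit(1)
    write(2, volume, "mft_record_9", "half-written")   # interrupted before commit
    print(recover(volume))                             # {'mft_record_5': 'updated'}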

Important
NTFS uses transaction logging and recovery to guarantee that the volume structure is not corrupted. For this reason, all system files remain accessible after a system failure. However, user data can be lost because of a system failure or a bad sector.

Cluster Remapping

In the event of a bad-sector error, NTFS implements a recovery technique called cluster remapping. When Windows 2000 detects a bad sector, NTFS dynamically remaps the cluster containing the bad sector and allocates a new cluster for the data. If the error occurs during a read, NTFS returns a read error to the calling program, and the data is lost. If the error occurs during a write, NTFS writes the data to the new cluster, and no data is lost.

NTFS puts the address of the cluster containing the bad sector in its bad cluster file so the bad sector is not reused.
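
The remapping decision itself can be modelled in a few lines. The Python sketch below uses a toy in-memory "disk" and a plain set standing in for the bad cluster file; it is a conceptual illustration, not NTFS code, and it shows how a failed write is retried on a freshly allocated cluster while the bad cluster is retired.

    # Conceptual model of cluster remapping; a toy in-memory disk, not NTFS's structures.

    class ToyDisk:
        def __init__(self, size, bad):
            self.data = {}                          # cluster number -> payload
            self.bad = set(bad)                     # clusters containing a bad sector
            self.free = set(range(size)) - self.bad

        def write(self, cluster, payload):
            if cluster in self.bad:
                raise IOError(f"bad sector in cluster {cluster}")
            self.data[cluster] = payload
            self.free.discard(cluster)

        def allocate(self):
            return self.free.pop()                  # pick any healthy free cluster

    bad_cluster_file = set()                        # stands in for the list of retired clusters

    def write_with_remap(disk, cluster, payload):
        """Write payload; on a bad-sector error, remap to a newly allocated cluster."""
        try:
            disk.write(cluster, payload)
            return cluster
        except IOError:
            bad_cluster_file.add(cluster)           # retire the bad cluster so it is not reused
            replacement = disk.allocate()
            disk.write(replacement, payload)        # the data is preserved on a write error
            return replacement

    disk = ToyDisk(size=8, bad={3})
    print(write_with_remap(disk, 3, b"file data"))  # some cluster other than 3
    print(sorted(bad_cluster_file))                 # [3]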

Important
Cluster remapping is not a backup alternative. Once errors are detected, the disk should be monitored closely and replaced if the defect list grows. This type of error is displayed in the Event Log.


The set of AT Attachment standards consists of this standard and the ATA implementation standards described in ATA8-AAM. The AT Attachment ATA Command Set (ATA8-ACS) specifies the command set host systems use to access storage devices. It provides a common command set for systems manufacturers, system integrators, software suppliers, and suppliers of intelligent storage devices.


This is the seventh ATA/ATAPI standard, released in 2003.

This standard specifies the AT Attachment Interface between host systems and storage devices. It provides a common attachment interface for systems manufacturers, system integrators, software suppliers, and suppliers of intelligent storage devices.

The application environment for the AT Attachment Interface is any host system that has storage devices contained within the processor enclosure.

Volume 1 defines the register delivered commands used by devices implementing the standard.

Volume 2 defines the connectors and cables for physical interconnection between host and storage device, the electrical and logical characteristics of the interconnecting signals, and the protocols for transporting commands, data, and status over the parallel interface.

Volume 3 defines the connectors and cables for physical interconnection between host and storage device, the electrical and logical characteristics of the interconnecting signals, and the protocols for transporting commands, data, and status over the serial interface.


This is the sixth ATA/ATAPI standard, released in 2001.

This standard specifies the AT Attachment Interface between host systems and storage devices. It provides a common attachment interface for systems manufacturers, system integrators, software suppliers, and suppliers of intelligent storage devices.

The application environment for the AT Attachment Interface is any host system that has storage devices contained within the processor enclosure.

This standard defines the connectors and cables for physical interconnection between host and storage device, as well as the electrical and logical characteristics of the interconnecting signals. It also defines the operational registers within the storage device, and the commands and protocols for the operation of the storage device.

This standard maintains a high degree of compatibility with the AT Attachment with Packet Interface - 5 standard (ATA/ATAPI-5), NCITS 340-2000, and while providing additional functions, is not intended to require changes to presently installed devices or existing software.


This is the fifth ATA/ATAPI standard, released in 2000.

This standard specifies the AT Attachment Interface between host systems and storage devices. It provides a common attachment interface for systems manufacturers, system integrators, software suppliers, and suppliers of intelligent storage devices.

The application environment for the AT Attachment Interface is any host system that has storage devices contained within the processor enclosure.

This standard defines the connectors and cables for physical interconnection between host and storage device, as well as the electrical and logical characteristics of the interconnecting signals. It also defines the operational registers within the storage device, and the commands and protocols for the operation of the storage device.

This standard maintains a high degree of compatibility with the AT Attachment with Packet Interface Extensions standard (ATA/ATAPI-4), NCITS 317-1998, and while providing additional functions, is not intended to require changes to presently installed devices or existing software.


Filesystem
2011.03.17 13:41

$EFS Attribute


When NTFS encrypts a file, it sets the Encrypted flag (0x4000) on the file and creates a $EFS attribute for the file, where it stores the DDFs (Data Decryption Fields) and DRFs (Data Recovery Fields). This attribute has Attribute ID 0x100 in NTFS and can be fairly lengthy, occupying from 0.5 KB to several kilobytes depending on the number of DDFs and DRFs.

Here is an example of the $EFS attribute in more detail.

$EFS attribute size

Computer SID and user number. This specifies the folder where EFS stores certificates. To derive the folder name, EFS performs the following manipulations:

5A56B378 1C365429 A815FF09 D0040000 - data stored in $EFS

78B3565A 2954361C 09FF15A8 000004D0 - byte order reversed

2025018970-693384732-167712168-1232 - converted to decimal

S-1-5-21-2025018970-693384732-167712168-1232 - SID prefix added

So, the folder will be %UserProfile%\Application Data\Microsoft\Crypto\RSA\S-1-5-21-2025018970-693384732-167712168-1232 (a short script reproducing this conversion appears after the field descriptions below).

Public key thumbprint

Private key GUID (also used as the container name). EFS uses this name when it acquires a context from the CryptoAPI provider. If there is only one DDF in the $EFS attribute, the container name can be figured out from $EFS (this field), but as more users are added to the file (more DDFs or DRFs), the PK GUID is not stored for all of them and must be retrieved from the certificate store based on the public key thumbprint.

Cryptographic provider name = Microsoft Base Cryptographic Provider v.1.0

User name to whom the current DDF or DRF belongs

Encrypted FEK. Usually the FEK is 128 bits long (in the case of DESX), but since it is encrypted with a 1024-bit RSA key, its encrypted length is 1024 bits.
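
The byte-order reversal and decimal conversion in the worked SID example above can be reproduced with a few lines of Python. This is only a sketch of the arithmetic, using the example values from this page.

    # Each stored 32-bit value is byte-reversed, converted to decimal and appended
    # to the well-known S-1-5-21 prefix to obtain the certificate folder name.

    def sid_from_efs(stored_words):
        parts = []
        for word in stored_words:
            value = int.from_bytes(bytes.fromhex(word)[::-1], "big")  # reverse byte order
            parts.append(str(value))
        return "S-1-5-21-" + "-".join(parts)

    # Values as stored in the example $EFS attribute above.
    print(sid_from_efs(["5A56B378", "1C365429", "A815FF09", "D0040000"]))
    # S-1-5-21-2025018970-693384732-167712168-1232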
