Autopsy 4.13.0
Graphical digital forensics platform for The Sleuth Kit and other tools.
Classes
org.sleuthkit.autopsy.ingest.DataSourceIngestJob Class Reference


class  Snapshot
enum  Stages

Private Member Functions

void addIngestModules (List< IngestModuleTemplate > templates, IngestModuleType type, SleuthkitCase skCase) throws TskCoreException
void checkForStageCompleted ()
void createIngestPipelines ()
void finish ()
void finishFirstStage ()
boolean hasFileIngestPipeline ()
boolean hasFirstStageDataSourceIngestPipeline ()
boolean hasSecondStageDataSourceIngestPipeline ()
void logErrorMessage (Level level, String message, Throwable throwable)
void logErrorMessage (Level level, String message)
void logInfoMessage (String message)
void logIngestModuleErrors (List< IngestModuleError > errors)
void startDataSourceIngestProgressBar ()
void startFileIngestProgressBar ()
void startFirstStage ()
void startSecondStage ()
List< IngestModuleError > startUpIngestPipelines ()

Static Private Member Functions

static List< IngestModuleTemplate > getConfiguredIngestModuleTemplates (Map< String, IngestModuleTemplate > ingestModuleTemplates, List< String > pipelineConfig)

Private Attributes

volatile IngestJob.CancellationReason cancellationReason = IngestJob.CancellationReason.NOT_CANCELLED
volatile boolean cancelled
final List< String > cancelledDataSourceIngestModules = new CopyOnWriteArrayList<>()
final long createTime
volatile boolean currentDataSourceIngestModuleCancelled
DataSourceIngestPipeline currentDataSourceIngestPipeline
String currentFileIngestModule = ""
String currentFileIngestTask = ""
final Content dataSource
final Object dataSourceIngestPipelineLock = new Object()
ProgressHandle dataSourceIngestProgress
final Object dataSourceIngestProgressLock = new Object()
final boolean doUI
long estimatedFilesToProcess
final List< FileIngestPipeline > fileIngestPipelines = new ArrayList<>()
final LinkedBlockingQueue< FileIngestPipeline > fileIngestPipelinesQueue = new LinkedBlockingQueue<>()
ProgressHandle fileIngestProgress
final Object fileIngestProgressLock = new Object()
final List< AbstractFile > files = new ArrayList<>()
final List< String > filesInProgress = new ArrayList<>()
DataSourceIngestPipeline firstStageDataSourceIngestPipeline
final long id
volatile IngestJobInfo ingestJob
final List< IngestModuleInfo > ingestModules = new ArrayList<>()
final IngestJob parentJob
long processedFiles
DataSourceIngestPipeline secondStageDataSourceIngestPipeline
final IngestJobSettings settings
volatile Stages stage = DataSourceIngestJob.Stages.INITIALIZATION
final Object stageCompletionCheckLock = new Object()

Static Private Attributes

static final Logger logger = Logger.getLogger(DataSourceIngestJob.class.getName())
static final AtomicLong nextJobId = new AtomicLong(0L)
static final IngestTasksScheduler taskScheduler = IngestTasksScheduler.getInstance()

Detailed Description

Encapsulates a data source and the ingest module pipelines used to process it.

Definition at line 58 of file

Member Function Documentation

void org.sleuthkit.autopsy.ingest.DataSourceIngestJob.addIngestModules ( List< IngestModuleTemplate >  templates,
IngestModuleType  type,
SleuthkitCase  skCase 
) throws TskCoreException
void org.sleuthkit.autopsy.ingest.DataSourceIngestJob.checkForStageCompleted ( )
void org.sleuthkit.autopsy.ingest.DataSourceIngestJob.createIngestPipelines ( )

Creates the file and data source ingest pipelines.

Make mappings of ingest module factory class names to templates.

Use the mappings and the ingest pipelines configuration to create ordered lists of ingest module templates for each ingest pipeline.

Add any module templates that were not specified in the pipelines configuration to an appropriate pipeline - either the first stage data source ingest pipeline or the file ingest pipeline.

Construct the data source ingest pipelines.

Construct the file ingest pipelines, one per file ingest thread.

The current thread was interrupted while blocked on a full queue. Blocking should actually never happen here, but reset the interrupted flag rather than just swallowing the exception.

Definition at line 221 of file

References org.sleuthkit.autopsy.ingest.DataSourceIngestJob.addIngestModules(), org.sleuthkit.autopsy.ingest.DataSourceIngestJob.getConfiguredIngestModuleTemplates(), org.sleuthkit.autopsy.casemodule.Case.getCurrentCaseThrows(), org.sleuthkit.autopsy.ingest.IngestJobSettings.getEnabledIngestModuleTemplates(), org.sleuthkit.autopsy.ingest.IngestManager.getInstance(), org.sleuthkit.autopsy.ingest.IngestManager.getNumberOfFileIngestThreads(), org.sleuthkit.autopsy.casemodule.Case.getSleuthkitCase(), and org.sleuthkit.autopsy.ingest.DataSourceIngestJob.logErrorMessage().
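The steps above can be sketched as follows. This is a minimal illustration, not the actual Autopsy code: `Template` is a simplified stand-in for `IngestModuleTemplate`, the boolean type flags stand in for the real module-type checks, and the config lists mimic the pipelines configuration.

```java
import java.util.*;

// Sketch of createIngestPipelines(): order templates by the pipeline
// configs, then route any unmentioned template to an appropriate pipeline.
class CreatePipelinesSketch {

    static final class Template {
        final String factoryClassName;
        final boolean hasDataSourceModule;
        final boolean hasFileModule;
        Template(String factoryClassName, boolean hasDataSourceModule, boolean hasFileModule) {
            this.factoryClassName = factoryClassName;
            this.hasDataSourceModule = hasDataSourceModule;
            this.hasFileModule = hasFileModule;
        }
    }

    static Map<String, List<Template>> buildPipelines(List<Template> enabled,
            List<String> firstStageConfig, List<String> fileConfig) {
        // Map ingest module factory class names to templates.
        Map<String, Template> byName = new LinkedHashMap<>();
        for (Template t : enabled) {
            byName.put(t.factoryClassName, t);
        }
        // Use the configs to create ordered template lists; claimed
        // templates are removed from the map.
        List<Template> firstStage = claim(byName, firstStageConfig);
        List<Template> filePipeline = claim(byName, fileConfig);
        // Templates not named in any config go to whichever pipeline
        // matches their module type.
        for (Template t : byName.values()) {
            if (t.hasDataSourceModule) firstStage.add(t);
            if (t.hasFileModule) filePipeline.add(t);
        }
        Map<String, List<Template>> pipelines = new HashMap<>();
        pipelines.put("firstStageDataSource", firstStage);
        pipelines.put("file", filePipeline);
        return pipelines;
    }

    private static List<Template> claim(Map<String, Template> byName, List<String> config) {
        List<Template> ordered = new ArrayList<>();
        for (String className : config) {
            Template t = byName.remove(className);
            if (t != null) ordered.add(t);
        }
        return ordered;
    }
}
```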

void org.sleuthkit.autopsy.ingest.DataSourceIngestJob.finish ( )
void org.sleuthkit.autopsy.ingest.DataSourceIngestJob.finishFirstStage ( )
static List<IngestModuleTemplate> org.sleuthkit.autopsy.ingest.DataSourceIngestJob.getConfiguredIngestModuleTemplates ( Map< String, IngestModuleTemplate >  ingestModuleTemplates,
List< String >  pipelineConfig 
)

Uses an input collection of ingest module templates and a pipeline configuration, i.e., an ordered list of ingest module factory class names, to create an ordered output list of ingest module templates for an ingest pipeline. The ingest module templates are removed from the input collection as they are added to the output collection.

Parameters:
ingestModuleTemplates  A mapping of ingest module factory class names to ingest module templates.
pipelineConfig  An ordered list of ingest module factory class names representing an ingest pipeline.

Returns:
An ordered list of ingest module templates, i.e., an uninstantiated pipeline.

Definition at line 314 of file

Referenced by org.sleuthkit.autopsy.ingest.DataSourceIngestJob.createIngestPipelines().
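The described contract, templates moved from the input map to the output list in config order, can be sketched like this. Plain `String` values stand in for `IngestModuleTemplate` objects; this is an illustration of the documented behavior, not the actual implementation.

```java
import java.util.*;

// Sketch of getConfiguredIngestModuleTemplates(): templates named in the
// pipeline config are claimed, in config order, and removed from the map.
class ConfiguredTemplatesSketch {
    static List<String> getConfiguredTemplates(Map<String, String> templatesByFactoryName,
            List<String> pipelineConfig) {
        List<String> ordered = new ArrayList<>();
        for (String factoryClassName : pipelineConfig) {
            // remove() both claims the template and shrinks the input map,
            // so the caller can route leftover templates afterwards.
            String template = templatesByFactoryName.remove(factoryClassName);
            if (template != null) {
                ordered.add(template);
            }
        }
        return ordered;
    }
}
```

Names in the config that have no matching template are simply skipped, and templates absent from the config remain in the map for the caller.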

boolean org.sleuthkit.autopsy.ingest.DataSourceIngestJob.hasFileIngestPipeline ( )

Checks to see if this job has a file level ingest pipeline.

Returns:
True if this job has a file level ingest pipeline, false otherwise.

Definition at line 406 of file

Referenced by org.sleuthkit.autopsy.ingest.DataSourceIngestJob.startFirstStage().

boolean org.sleuthkit.autopsy.ingest.DataSourceIngestJob.hasFirstStageDataSourceIngestPipeline ( )

Checks to see if this job has a first stage data source level ingest pipeline.

Returns:
True if this job has a first stage data source level ingest pipeline, false otherwise.

Definition at line 387 of file

Referenced by org.sleuthkit.autopsy.ingest.DataSourceIngestJob.startFirstStage().

boolean org.sleuthkit.autopsy.ingest.DataSourceIngestJob.hasSecondStageDataSourceIngestPipeline ( )

Checks to see if this job has a second stage data source level ingest pipeline.

Returns:
True if this job has a second stage data source level ingest pipeline, false otherwise.

Definition at line 397 of file

Referenced by org.sleuthkit.autopsy.ingest.DataSourceIngestJob.finishFirstStage().

void org.sleuthkit.autopsy.ingest.DataSourceIngestJob.logErrorMessage ( Level  level,
String  message,
Throwable  throwable 
)

Writes an error message to the application log that includes the data source name, data source object id, and the job id.

Parameters:
level  The logging level for the message.
message  The message.
throwable  The throwable associated with the error.

Definition at line 1087 of file


Referenced by org.sleuthkit.autopsy.ingest.DataSourceIngestJob.createIngestPipelines(), org.sleuthkit.autopsy.ingest.DataSourceIngestJob.finish(), and org.sleuthkit.autopsy.ingest.DataSourceIngestJob.logIngestModuleErrors().
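The pattern described above, prefixing every log message with job context so lines can be correlated with a specific data source and job, can be sketched as follows. The message format and helper names here are illustrative assumptions, not the exact Autopsy format.

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Sketch of context-aware error logging: the data source name, data source
// object id, and job id are appended to every message before logging.
class JobContextLoggingSketch {
    private static final Logger logger = Logger.getLogger(JobContextLoggingSketch.class.getName());

    static String format(String dataSourceName, long dataSourceObjectId, long jobId, String message) {
        return String.format("%s (data source = %s, objId = %d, jobId = %d)",
                message, dataSourceName, dataSourceObjectId, jobId);
    }

    static void logErrorMessage(String dataSourceName, long dataSourceObjectId, long jobId,
            Level level, String message, Throwable throwable) {
        logger.log(level, format(dataSourceName, dataSourceObjectId, jobId, message), throwable);
    }
}
```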

void org.sleuthkit.autopsy.ingest.DataSourceIngestJob.logErrorMessage ( Level  level,
String  message 
)

Writes an error message to the application log that includes the data source name, data source object id, and the job id.

Parameters:
level  The logging level for the message.
message  The message.

Definition at line 1098 of file


void org.sleuthkit.autopsy.ingest.DataSourceIngestJob.logInfoMessage ( String  message)
void org.sleuthkit.autopsy.ingest.DataSourceIngestJob.logIngestModuleErrors ( List< IngestModuleError >  errors)
void org.sleuthkit.autopsy.ingest.DataSourceIngestJob.startDataSourceIngestProgressBar ( )
void org.sleuthkit.autopsy.ingest.DataSourceIngestJob.startFileIngestProgressBar ( )
void org.sleuthkit.autopsy.ingest.DataSourceIngestJob.startFirstStage ( )

Starts the first stage of this job.

Start one or both of the first stage ingest progress bars.

Make the first stage data source level ingest pipeline the current data source level pipeline.

Schedule the first stage tasks.

No data source ingest task has been scheduled for this stage, and it is possible, if unlikely, that no file ingest tasks were actually scheduled either, since some files are filtered out by the task scheduler. In this special case, no ingest thread will ever check for completion of this stage of the job, so do it now.

Definition at line 491 of file

References org.sleuthkit.autopsy.ingest.DataSourceIngestJob.checkForStageCompleted(), org.sleuthkit.autopsy.ingest.DataSourceIngestJob.dataSourceIngestPipelineLock, org.sleuthkit.autopsy.ingest.DataSourceIngestJob.fileIngestProgressLock, org.sleuthkit.autopsy.ingest.DataSourceIngestJob.Stages.FIRST, org.sleuthkit.autopsy.ingest.DataSourceIngestJob.firstStageDataSourceIngestPipeline, org.sleuthkit.autopsy.ingest.DataSourceIngestJob.hasFileIngestPipeline(), org.sleuthkit.autopsy.ingest.DataSourceIngestJob.hasFirstStageDataSourceIngestPipeline(), org.sleuthkit.autopsy.ingest.DataSourceIngestJob.logInfoMessage(), org.sleuthkit.autopsy.ingest.DataSourceIngestJob.startDataSourceIngestProgressBar(), org.sleuthkit.autopsy.ingest.DataSourceIngestJob.startFileIngestProgressBar(), and org.sleuthkit.autopsy.ingest.DataSourceIngestJob.taskScheduler.
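The scheduling edge case described above can be sketched as follows. This is a simplified illustration under assumed names: `Scheduler` stands in for the task scheduler singleton, and the completion check is a placeholder.

```java
import java.util.List;

// Sketch of the startFirstStage() edge case: if no data source task exists
// and the scheduler filtered out every file task, no ingest thread will ever
// trigger the stage-completion check, so the job must run the check itself.
class StartFirstStageSketch {

    interface Scheduler {
        /** Returns the number of file tasks actually enqueued after filtering. */
        int scheduleFileTasks(List<String> files);
    }

    /** Returns true if the stage completed immediately (nothing was scheduled). */
    static boolean startFirstStage(boolean hasDataSourcePipeline, boolean hasFilePipeline,
            List<String> files, Scheduler scheduler) {
        int scheduled = 0;
        if (hasFilePipeline) {
            scheduled += scheduler.scheduleFileTasks(files);
        }
        if (!hasDataSourcePipeline && scheduled == 0) {
            // No thread will ever report a finished task for this stage,
            // so check for stage completion right now.
            return checkForStageCompleted();
        }
        return false;
    }

    private static boolean checkForStageCompleted() {
        return true; // placeholder: the real check queries the task scheduler
    }
}
```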

void org.sleuthkit.autopsy.ingest.DataSourceIngestJob.startSecondStage ( )
List<IngestModuleError> org.sleuthkit.autopsy.ingest.DataSourceIngestJob.startUpIngestPipelines ( )

Starts up each of the ingest pipelines for this job and collects any file and data source level ingest module errors that occur.

Returns:
A collection of ingest module startup errors, empty on success.

Definition at line 443 of file

References org.sleuthkit.autopsy.ingest.DataSourceIngestJob.logIngestModuleErrors().
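The collect-all-errors pattern described here, starting every pipeline rather than stopping at the first failure so that every broken module can be reported at once, can be sketched like this. `Pipeline` and the `String` error type are simplified stand-ins.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of startUpIngestPipelines(): start every pipeline and accumulate
// all start-up errors; an empty result means every pipeline started cleanly.
class StartUpPipelinesSketch {

    interface Pipeline {
        List<String> startUp(); // returns error messages, empty on success
    }

    static List<String> startUpIngestPipelines(List<Pipeline> pipelines) {
        List<String> errors = new ArrayList<>();
        for (Pipeline pipeline : pipelines) {
            errors.addAll(pipeline.startUp()); // keep going even after errors
        }
        return errors; // empty on success
    }
}
```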

Member Data Documentation

volatile IngestJob.CancellationReason org.sleuthkit.autopsy.ingest.DataSourceIngestJob.cancellationReason = IngestJob.CancellationReason.NOT_CANCELLED

Definition at line 139 of file

volatile boolean org.sleuthkit.autopsy.ingest.DataSourceIngestJob.cancelled

Definition at line 138 of file

final List<String> org.sleuthkit.autopsy.ingest.DataSourceIngestJob.cancelledDataSourceIngestModules = new CopyOnWriteArrayList<>()

Definition at line 137 of file

final long org.sleuthkit.autopsy.ingest.DataSourceIngestJob.createTime

A data source ingest job uses this field to report its creation time.

Definition at line 177 of file

volatile boolean org.sleuthkit.autopsy.ingest.DataSourceIngestJob.currentDataSourceIngestModuleCancelled

A data source ingest job supports cancellation of either the currently running data source level ingest module or the entire ingest job.

TODO: The currentDataSourceIngestModuleCancelled field and all of the code concerned with it is a hack to avoid an API change. The next time an API change is legal, a cancel() method needs to be added to the IngestModule interface and this field should be removed. The "ingest job is canceled" queries should also be removed from the IngestJobContext class.

Definition at line 136 of file

DataSourceIngestPipeline org.sleuthkit.autopsy.ingest.DataSourceIngestJob.currentDataSourceIngestPipeline

Definition at line 113 of file

String org.sleuthkit.autopsy.ingest.DataSourceIngestJob.currentFileIngestModule = ""

Definition at line 169 of file

String org.sleuthkit.autopsy.ingest.DataSourceIngestJob.currentFileIngestTask = ""

Definition at line 170 of file

final Content org.sleuthkit.autopsy.ingest.DataSourceIngestJob.dataSource

Definition at line 72 of file

final Object org.sleuthkit.autopsy.ingest.DataSourceIngestJob.dataSourceIngestPipelineLock = new Object()

A data source ingest job has separate data source level ingest module pipelines for the first and second processing stages. Longer running, lower priority modules belong in the second stage pipeline, although this cannot be enforced. Note that the pipelines for both stages are created at job start up to allow for verification that they both can be started up without errors.

Definition at line 110 of file

Referenced by org.sleuthkit.autopsy.ingest.DataSourceIngestJob.startFirstStage(), and org.sleuthkit.autopsy.ingest.DataSourceIngestJob.startSecondStage().
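The locking pattern described above can be sketched as follows: both stage pipelines exist from job start-up, and a dedicated lock object guards every read and write of the "current" pipeline reference. `String` stands in for `DataSourceIngestPipeline`; this is an assumed minimal illustration, not the actual code.

```java
// Sketch of the dataSourceIngestPipelineLock pattern: the first and second
// stage pipelines are created up front, and swapping the current pipeline
// between stages is synchronized on a dedicated lock object.
class PipelineSwapSketch {
    private final Object dataSourceIngestPipelineLock = new Object();
    private final String firstStagePipeline = "first-stage";
    private final String secondStagePipeline = "second-stage";
    private String currentPipeline;

    void startFirstStage() {
        synchronized (dataSourceIngestPipelineLock) {
            currentPipeline = firstStagePipeline;
        }
    }

    void startSecondStage() {
        synchronized (dataSourceIngestPipelineLock) {
            currentPipeline = secondStagePipeline;
        }
    }

    String currentPipeline() {
        synchronized (dataSourceIngestPipelineLock) {
            return currentPipeline;
        }
    }
}
```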

ProgressHandle org.sleuthkit.autopsy.ingest.DataSourceIngestJob.dataSourceIngestProgress

Definition at line 158 of file

final Object org.sleuthkit.autopsy.ingest.DataSourceIngestJob.dataSourceIngestProgressLock = new Object()
final boolean org.sleuthkit.autopsy.ingest.DataSourceIngestJob.doUI

A data source ingest job can run interactively using NetBeans progress handles.

Definition at line 151 of file

long org.sleuthkit.autopsy.ingest.DataSourceIngestJob.estimatedFilesToProcess

Definition at line 166 of file

final List<FileIngestPipeline> org.sleuthkit.autopsy.ingest.DataSourceIngestJob.fileIngestPipelines = new ArrayList<>()

Definition at line 123 of file

final LinkedBlockingQueue<FileIngestPipeline> org.sleuthkit.autopsy.ingest.DataSourceIngestJob.fileIngestPipelinesQueue = new LinkedBlockingQueue<>()

A data source ingest job has a collection of identical file level ingest module pipelines, one for each file level ingest thread in the ingest manager. A blocking queue is used to dole out the pipelines to the threads and an ordinary list is used when the ingest job needs to access the pipelines to query their status.

Definition at line 122 of file
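The dual-collection pattern described above can be sketched like this: a blocking queue doles pipelines out to file ingest threads, while a plain list retains references for status queries. `String` stands in for `FileIngestPipeline` and the thread-count constant is illustrative; the interrupt handling mirrors the note in createIngestPipelines() about resetting the interrupted flag rather than swallowing the exception.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch of the fileIngestPipelinesQueue pattern: threads check a pipeline
// out of the blocking queue, use it, and check it back in; the ordinary
// list is kept alongside so the job can query pipeline status at any time.
class PipelineQueueSketch {
    static final int NUMBER_OF_FILE_INGEST_THREADS = 2;

    final List<String> fileIngestPipelines = new ArrayList<>();
    final LinkedBlockingQueue<String> fileIngestPipelinesQueue = new LinkedBlockingQueue<>();

    PipelineQueueSketch() {
        for (int i = 0; i < NUMBER_OF_FILE_INGEST_THREADS; ++i) {
            String pipeline = "pipeline-" + i;
            fileIngestPipelinesQueue.offer(pipeline); // never blocks: queue is unbounded
            fileIngestPipelines.add(pipeline);        // retained for status queries
        }
    }

    // A file ingest thread checks a pipeline out, uses it, then checks it back in.
    String processFile(String file) {
        String pipeline;
        try {
            pipeline = fileIngestPipelinesQueue.take(); // blocks while all pipelines are in use
        } catch (InterruptedException ex) {
            Thread.currentThread().interrupt(); // reset the flag rather than swallow it
            return "interrupted";
        }
        try {
            return pipeline + " processed " + file;
        } finally {
            fileIngestPipelinesQueue.offer(pipeline); // check the pipeline back in
        }
    }
}
```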

ProgressHandle org.sleuthkit.autopsy.ingest.DataSourceIngestJob.fileIngestProgress

Definition at line 168 of file

final Object org.sleuthkit.autopsy.ingest.DataSourceIngestJob.fileIngestProgressLock = new Object()
final List<AbstractFile> org.sleuthkit.autopsy.ingest.DataSourceIngestJob.files = new ArrayList<>()

Definition at line 73 of file

final List<String> org.sleuthkit.autopsy.ingest.DataSourceIngestJob.filesInProgress = new ArrayList<>()

Definition at line 165 of file

DataSourceIngestPipeline org.sleuthkit.autopsy.ingest.DataSourceIngestJob.firstStageDataSourceIngestPipeline
final long org.sleuthkit.autopsy.ingest.DataSourceIngestJob.id
volatile IngestJobInfo org.sleuthkit.autopsy.ingest.DataSourceIngestJob.ingestJob

Definition at line 172 of file

final List<IngestModuleInfo> org.sleuthkit.autopsy.ingest.DataSourceIngestJob.ingestModules = new ArrayList<>()

Definition at line 171 of file

final Logger org.sleuthkit.autopsy.ingest.DataSourceIngestJob.logger = Logger.getLogger(DataSourceIngestJob.class.getName())

Definition at line 60 of file

final AtomicLong org.sleuthkit.autopsy.ingest.DataSourceIngestJob.nextJobId = new AtomicLong(0L)

Definition at line 69 of file

final IngestJob org.sleuthkit.autopsy.ingest.DataSourceIngestJob.parentJob

These fields define a data source ingest job: the parent ingest job, an ID, the user's ingest job settings, and the data source to be analyzed. Optionally, there is a set of files to be analyzed instead of analyzing all of the files in the data source.

Definition at line 68 of file

long org.sleuthkit.autopsy.ingest.DataSourceIngestJob.processedFiles

Definition at line 167 of file

DataSourceIngestPipeline org.sleuthkit.autopsy.ingest.DataSourceIngestJob.secondStageDataSourceIngestPipeline
final IngestJobSettings org.sleuthkit.autopsy.ingest.DataSourceIngestJob.settings

Definition at line 71 of file

volatile Stages org.sleuthkit.autopsy.ingest.DataSourceIngestJob.stage = DataSourceIngestJob.Stages.INITIALIZATION

Definition at line 99 of file

final Object org.sleuthkit.autopsy.ingest.DataSourceIngestJob.stageCompletionCheckLock = new Object()
final IngestTasksScheduler org.sleuthkit.autopsy.ingest.DataSourceIngestJob.taskScheduler = IngestTasksScheduler.getInstance()

A data source ingest job uses the task scheduler singleton to create and queue the ingest tasks that make up the job.

Definition at line 145 of file

Referenced by org.sleuthkit.autopsy.ingest.DataSourceIngestJob.checkForStageCompleted(), org.sleuthkit.autopsy.ingest.DataSourceIngestJob.startFirstStage(), and org.sleuthkit.autopsy.ingest.DataSourceIngestJob.startSecondStage().

The documentation for this class was generated from the following file:

Copyright © 2012-2019 Basis Technology. Generated on: Tue Jan 7 2020
This work is licensed under a Creative Commons Attribution-Share Alike 3.0 United States License.