# Image Classification Sample
This topic demonstrates how to build and run the Image Classification sample application, which performs
inference using image classification networks such as AlexNet and GoogLeNet.
## Running
Running the application with the `-h` option yields the following usage message:
```sh
./classification_sample -h
InferenceEngine:
API version ............
Build ..................
classification_sample [OPTION]
Options:
-h
        Print a usage message.
-i "<path1>" "<path2>"
        Required. Path to a folder with images or a path to image files: a .ubyte file for LeNet
        and a .bmp file for the other networks.
-m "<path>"
        Required. Path to an .xml file with a trained model.
-l "<absolute_path>"
        Optional. Absolute path to a library with MKL-DNN (CPU) custom layers (*.so).
Or
-c "<absolute_path>"
        Optional. Absolute path to the clDNN (GPU) custom layers config (*.xml).
-pp "<path>"
        Path to a plugin folder.
-d "<device>"
        Specify the target device to infer on; CPU, GPU, FPGA, or MYRIAD is acceptable. The sample will look for a suitable plugin for the specified device.
-nt "<integer>"
        Number of top results (default 10)
-ni "<integer>"
        Number of iterations (default 1)
-pc
        Enables per-layer performance report
-p_msg
        Enables messages from a plugin
```
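For instance, a run that combines several of these options might look like the following. This is an illustrative command only: the `<path_to_image>` and `<path_to_model>` segments are placeholders for your own paths, and the flags used are those documented above.

```sh
# Illustrative only: run 10 iterations on a Myriad VPU, print the top 5
# results, and enable the per-layer performance report.
./classification_sample -i <path_to_image>/cat.bmp -m <path_to_model>/alexnet_fp32.xml \
    -d MYRIAD -nt 5 -ni 10 -pc
```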
Running the application with an empty list of options yields the usage message above and an error message.
To do inference on an image using a trained AlexNet network on Intel® processors, use the following command:
```sh
./classification_sample -i <path_to_image>/cat.bmp -m <path_to_model>/alexnet_fp32.xml
```
### Outputs
By default, the application outputs the top-10 inference results.
Add the `-nt` option to the previous command to modify the number of top output results.
For example, to get the top-5 results on Intel® HD Graphics, use the following command:
```sh
./classification_sample -i <path_to_image>/cat.bmp -m <path_to_model>/alexnet_fp32.xml -nt 5 -d GPU
```
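The results are printed to the standard output stream as a ranked list of class IDs and probabilities. Schematically, the output looks roughly like this (the `<id>` and `<probability>` entries are placeholders, not real results):

```
Top 5 results:

classid probability
------- -----------
<id>    <probability>
...
```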
### How It Works
Upon start-up, the sample application reads the command-line parameters and loads a network and an image to the Inference
Engine plugin. When inference is done, the application creates an
output image and writes data to the standard output stream.
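
This flow corresponds to the standard Inference Engine workflow: read the model, load it to a device, and run an inference request. The sketch below is a simplified illustration of that workflow using the `InferenceEngine::Core` API; it is not the sample's actual source, the model path and device name are assumptions, the blob-handling steps are elided, and exact class and method names vary between Inference Engine releases.

```cpp
// Simplified sketch of the Inference Engine workflow the sample follows.
// Assumes a release that provides InferenceEngine::Core; error handling
// and image pre-processing are omitted.
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core ie;

    // Read the model given by -m (the matching .bin weights file is
    // located automatically next to the .xml file).
    InferenceEngine::CNNNetwork network = ie.ReadNetwork("alexnet_fp32.xml");

    // Compile the network for the device given by -d.
    InferenceEngine::ExecutableNetwork exec = ie.LoadNetwork(network, "CPU");

    // Create a request, fill the input blob with the image given by -i,
    // run synchronous inference, then read the scores from the output blob.
    InferenceEngine::InferRequest request = exec.CreateInferRequest();
    // ... copy image data into the input blob from request.GetBlob(...) ...
    request.Infer();
    // ... sort the output blob's scores and print the top -nt results ...
    return 0;
}
```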
## See Also
* [Using Inference Engine Samples](./docs/Inference_Engine_Developer_Guide/Samples_Overview.md)