Quick Start

This quick start guide is an abridged version of the full documentation in which you'll run through a typical workflow with sample data. At the end of each bullet, there is a link to the corresponding section of the full documentation with a detailed explanation. You can click on the full documentation sections on the left 👈, or here: Setting Up, Linking, Collapsing, Visualization

  • Install Julia from their website, making sure to add julia to your PATH. (Download and install Julia)
  • Open a terminal in a new, empty folder and type julia to open a Julia REPL in that directory. (Open a Julia REPL in a directory)
  • Create a Julia environment and install MicroTracker into it using the following two commands. You can paste these into the Julia REPL; the leading ] switches the REPL into Pkg mode.
] activate .
add MicroTracker
  • Import MicroTracker into your namespace using the following command. Make sure you press backspace to exit Pkg mode before running it. Note that MicroTracker also automatically installs and imports a couple of Python functions; see Python interface if any issues arise.
using MicroTracker
  • Create a new project with example data using the create_project_here function. This populates your empty folder with the structure for easily analyzing microscopy data with MicroTracker. The supplied data is only two videos with two independent variables for demonstration purposes, but MicroTracker is capable of processing hundreds of videos at once. (Create a MicroTracker project).
create_project_here()
  • Define a translation_dict that tells MicroTracker how to parse each video filename into experimental metadata. Each entry maps an independent variable name to its position in the filename and its type. (Linking)
translation_dict = Dict("f_Hz"=>(1, Int64), "B_mT"=>(2, Float64), "FPS"=>(3, Float64))
  • Define the linking_settings: MPP is the microscope resolution in microns per pixel, SEARCH_RANGE_MICRONS is how far (in µm) a microbot can move between frames and still be linked, MEMORY is the number of frames a microbot can disappear and still be considered the same microbot, and STUBS_SECONDS is the minimum trajectory duration to keep. (Linking)
linking_settings = (MPP = 0.605, SEARCH_RANGE_MICRONS = 1000, MEMORY = 0, STUBS_SECONDS = 0.5)
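For your own experiments, the same pattern extends to any number of independent variables: each entry is name => (position in the filename, type). As a purely hypothetical illustration (the extra variable name and position below are made up and not part of the example data):
translation_dict = Dict("f_Hz"=>(1, Int64), "B_mT"=>(2, Float64), "FPS"=>(3, Float64), "conc_mgmL"=>(4, Float64))  # hypothetical fourth variable, for illustration only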
  • Batch process segmented image data into linked time-series microbot trajectories with the batch_particle_data_to_linked_data function. This combines all of the processing steps into a single command that handles an entire experimental array. (Batch linking)
linked_data = batch_particle_data_to_linked_data(translation_dict, linking_settings)
  • Collapse/summarize linked data for comparing microbots across experiments and videos using the collapse_data function. (Collapsing)
collapsed_data = collapse_data(linked_data, translation_dict)
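If you'd like to inspect the summarized table, which has one row per microbot, standard DataFrames.jl functions work on it; a minimal sketch, assuming DataFrames.jl is installed in this environment (add it with add DataFrames in Pkg mode if it is not):
using DataFrames  # assuming DataFrames.jl is available in this environment
names(collapsed_data)  # list the summary columns, including the variables from translation_dict
first(collapsed_data, 3)  # preview the first few summarized microbots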
  • Filter the microbot trajectories based on their collapsed statistics. Microbots may need to be excluded from the data if they are too small or too large for the study, moving too slowly, or stuck to the substrate. Use filter_trajectories for the most common filters. (Filtering based on collapsed data).
filter_settings = (
    MIN_VELOCITY = 10.0,  # µm / s
    MIN_BOUNDING_RADIUS = 3.38,  # µm
    MAX_BOUNDING_RADIUS = 75,  # µm
    MIN_DISPLACEMENT = 0,  # µm
)

fcd = filtered_collapsed_data = filter_trajectories(collapsed_data, filter_settings)
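As a quick sanity check on the filter settings, you can compare how many microbots survive filtering (again assuming DataFrames.jl is available, as in the sketch above):
nrow(collapsed_data), nrow(fcd)  # total microbots vs. microbots that passed the filters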
  • View the data on an experiment-wide scale using the filtered collapsed data. MicroTracker re-exports Plots.jl and StatsPlots.jl for convenience. (Visualization)
@df fcd scatter(:R, :V, group=:B_mT, xlims=(0, 40), ylims=(0, 100), xlabel="R (µm)", ylabel="V (µm/s)", leg_title = "B (mT)")
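To keep a copy of this overview plot, the standard Plots.jl savefig call works on the most recent plot (the filename here is just an example):
savefig("overview_R_vs_V.png")  # example filename; writes the current plot to disk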
  • Analyze an individual microbot in detail by choosing its unique id from the filtered data and passing it to the trajectory_analyzer. (Visualization)
chosen_particle = fcd.particle_unique[2]
trajectory_analyzer(linked_data, collapsed_data, chosen_particle)
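If you want to review every microbot that passed the filters rather than a single one, a loop like the sketch below works, assuming trajectory_analyzer returns a figure that display can show:
for p in fcd.particle_unique
    display(trajectory_analyzer(linked_data, collapsed_data, p))  # one trajectory summary per filtered microbot
end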

What's next?

  1. Dive deeper with the included MicroTracker Pluto notebook and the interactive version of the trajectory_analyzer on the Pluto page.
  2. Get started with using your own data and experiments in Experimental and Segmentation.