Commit 45108030 authored by David Daish

Final cleanup of branch

parent 24039675
@@ -38,7 +38,7 @@
 <physics type="ode">
-<max_step_size>0.008</max_step_size> <!--Note: The higher this number, the faster the sim.-->
+<max_step_size>0.01</max_step_size> <!--Note: The higher this number, the faster the sim.-->
@@ -203,7 +203,7 @@ class PlankDrop : public WorldPlugin
 double run_end;
 uint32_t teardown_step;
 unsigned int sim_runs = 0;
-unsigned int max_sim_runs = 10;
+unsigned int max_sim_runs = 30;
 // Seconds to let the sim run for
 double run_duration = 5;
@@ -21,7 +21,8 @@ Each simulation run produces around 30 datasets.
-for i in $(seq $batch_size); do docker run --rm -v ~/Downloads/dataset:/mnt/dataset gazebo; done
+for i in $(seq $batch_size); do docker run --rm -v $dataset_output_dir:/mnt/dataset gazebo; done
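The updated commands assume `$batch_size` and `$dataset_output_dir` are already set in the shell. A minimal setup might look like this (the values shown are hypothetical; adjust them for your environment):

```shell
# Hypothetical values; adjust for your environment.
batch_size=8
dataset_output_dir="$HOME/Downloads/dataset"
mkdir -p "$dataset_output_dir"   # make sure the host directory exists before mounting it
```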
### Run headless
@@ -29,7 +30,8 @@ for i in $(seq $batch_size); do docker run --rm -v ~/Downloads/dataset:/mnt/dataset gazebo; done
 This runs the simulation once, producing around 30 datasets.
-docker run --rm -v ~/Downloads/dataset:/mnt/dataset gazebo
+docker run --rm -v $dataset_output_dir:/mnt/dataset gazebo
### Run with GUI
@@ -44,9 +46,11 @@ tag found in the `include/` file.
 xhost +
 docker run -it --rm \
-  -v ~/Downloads/dataset:/mnt/dataset \
+  -v $dataset_output_dir:/mnt/dataset \
   -v /tmp/.X11-unix:/tmp/.X11-unix:ro \
   gazebo gazebo --verbose
@@ -64,7 +68,8 @@ This method requires the GNU Parallel application.
-seq $batch_size | parallel -j-2 "docker run --rm -v ~/Downloads/dataset:/mnt/dataset --name=plank-drop-container-{} gazebo"
+seq $batch_size | parallel -j-2 "docker run --rm -v $dataset_output_dir:/mnt/dataset --name=plank-drop-container-{} gazebo"
## Other
@@ -47,7 +47,13 @@ ideas for increasing accuracy:
- The ghost models can be accessed using `ScenePtr->WorldVisual()->GetChild()`.
- I cannot find the cause, so I have resolved the issue by creating a simple,
hacky system that ends the process if more visuals are detected than expected.
- For a reason I cannot identify, the hacky solution sometimes produces blank
  white images as the only dataset output.
- The NN would have no way to differentiate the poses of a plank that is
  symmetrical. The serialisation process needs to limit the pose to just a few
  distinct directions, so that the NN doesn't need to determine the exact pose,
  only the relevant ones.
- The strange blank datasets need to be removed from the data before it is used
  to train the neural network.
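That cleanup step could be sketched as a shell filter. This is a hedged sketch only: it assumes the datasets are PNG images under `$dataset_output_dir`, that ImageMagick's `identify` is installed, and the 0.999 threshold for "blank white" is a guess to be tuned:

```shell
# Hedged sketch: drop fully white frames before training.
# A fully white image has a mean pixel value of 1.0.
is_blank() {
  # succeeds (exit 0) when the mean pixel value exceeds the threshold
  awk -v m="$1" 'BEGIN { exit !(m > 0.999) }'
}

for img in "$dataset_output_dir"/*.png; do
  [ -e "$img" ] || continue                       # skip when the glob matches nothing
  mean=$(identify -format "%[fx:mean]" "$img")    # ImageMagick mean pixel value in [0, 1]
  if is_blank "$mean"; then
    echo "Removing blank image: $img"
    rm -- "$img"
  fi
done
```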
- The only way to prematurely end the serial batch dataset generation is to
  close the terminal window, which isn't very graceful.
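One hedged option is a stop-flag check between runs, so the batch ends cleanly after the current simulation finishes. In this sketch, `sleep 0.1` stands in for the real `docker run --rm -v $dataset_output_dir:/mnt/dataset gazebo` call, and `/tmp/stop-plank-drop` is a hypothetical flag file:

```shell
stop_file=/tmp/stop-plank-drop   # touch this file from another terminal to stop the batch
rm -f "$stop_file"
batch_size=3
completed=0
for i in $(seq "$batch_size"); do
  if [ -e "$stop_file" ]; then
    echo "Stopping batch after $completed runs"
    break
  fi
  sleep 0.1   # placeholder for: docker run --rm -v $dataset_output_dir:/mnt/dataset gazebo
  completed=$((completed + 1))
done
echo "Completed $completed of $batch_size runs"
```

Because the flag is only checked between iterations, the run in progress still finishes and writes its datasets before the batch stops.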