Commit 45108030 authored by David Daish

Final cleanup of branch

parent 24039675
@@ -38,7 +38,7 @@
     <physics type="ode">
       <real_time_update_rate>0.0</real_time_update_rate>
-      <max_step_size>0.008</max_step_size> <!--Note: The higher this number, the faster the sim.-->
+      <max_step_size>0.01</max_step_size> <!--Note: The higher this number, the faster the sim.-->
     </physics>
   </world>
 </sdf>
@@ -203,7 +203,7 @@ class PlankDrop : public WorldPlugin
   double run_end;
   uint32_t teardown_step;
   unsigned int sim_runs = 0;
-  unsigned int max_sim_runs = 10;
+  unsigned int max_sim_runs = 30;
   // Seconds to let the sim run for
   double run_duration = 5;
......
@@ -21,7 +21,8 @@ Each simulation run produces around 30 datasets.
 ```
 batch_size=1000
-for i in $(seq $batch_size); do docker run --rm -v ~/Downloads/dataset:/mnt/dataset gazebo; done
+dataset_output_dir=~/Downloads/dataset
+for i in $(seq $batch_size); do docker run --rm -v $dataset_output_dir:/mnt/dataset gazebo; done
 ```
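Since each run is expected to write around 30 datasets, a batch can be spot-checked afterwards by counting the files in the mounted output directory. A minimal sketch, using a temporary directory and placeholder files in place of the real `dataset_output_dir` contents:

```shell
# Sketch only: in practice, point this at the real dataset_output_dir
# (e.g. ~/Downloads/dataset) instead of a temp dir.
dataset_output_dir=$(mktemp -d)

# Stand-in for one simulation run's output (~30 datasets per run).
for i in $(seq 30); do touch "$dataset_output_dir/dataset_$i"; done

# With batch_size runs, expect roughly 30 * batch_size files here.
ls "$dataset_output_dir" | wc -l
```

A count far below `30 * batch_size` after a batch would suggest some runs failed or were cut short.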
 ### Run headless
@@ -29,7 +30,8 @@
 This runs the simulation once, producing around 30 datasets.
 ```
-docker run --rm -v ~/Downloads/dataset:/mnt/dataset gazebo
+dataset_output_dir=~/Downloads/dataset
+docker run --rm -v $dataset_output_dir:/mnt/dataset gazebo
 ```
 ### Run with GUI
@@ -44,9 +46,11 @@ tag found in the `include/lit_world.world` file.
 ```
 xhost +
+dataset_output_dir=~/Downloads/dataset
 docker run -it --rm \
     -e DISPLAY=$DISPLAY \
-    -v ~/Downloads/dataset:/mnt/dataset \
+    -v $dataset_output_dir:/mnt/dataset \
     -v /tmp/.X11-unix:/tmp/.X11-unix:ro \
     gazebo gazebo --verbose lit_world.world
 ```
@@ -64,7 +68,8 @@ This method requires the GNU Parallel application.
 ```
 batch_size=1000
-seq $batch_size | parallel -j-2 "docker run --rm -v ~/Downloads/dataset:/mnt/dataset --name=plank-drop-container-{} gazebo"
+dataset_output_dir=~/Downloads/dataset
+seq $batch_size | parallel -j-2 "docker run --rm -v $dataset_output_dir:/mnt/dataset --name=plank-drop-container-{} gazebo"
 ```
 ## Other
......
@@ -47,7 +47,13 @@ ideas for increasing accuracy:
 - The ghost models can be accessed using `ScenePtr->WorldVisual()->GetChild()`.
 - I cannot find the cause, so I have resolved the issue by creating a simple,
   hacky system that ends the process if more visuals are detected than expected.
+- For a reason I cannot identify, the hacky solution will produce blank white
+  images as the only produced dataset.
 - The NN would have no way to differentiate the poses of a plank that is
   symmetrical. The serialisation process needs to limit it to just a few
   directions, such that the NN doesn't need to determine the exact pose, just the
   relevant ones.
+- The strange blank datasets need to be removed from the data before it is used
+  to train the neural net.
+- The only way to prematurely end the serial batch dataset generation is to close
+  the terminal window, which isn't very graceful.
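The ungraceful-termination point above could be addressed with a signal trap around the serial batch loop, so that Ctrl-C lets the current run finish and then stops, instead of requiring the whole terminal to be closed. A hedged sketch of the idea (the `docker run` command is replaced by a placeholder `echo` here so the loop logic stands alone):

```shell
batch_size=5          # would be 1000 in the real batch
interrupted=0

# On Ctrl-C (SIGINT), record a flag instead of tearing down the session.
trap 'interrupted=1' INT

for i in $(seq $batch_size); do
    if [ "$interrupted" -eq 1 ]; then
        echo "Interrupted after $((i - 1)) runs; stopping gracefully."
        break
    fi
    # Placeholder for: docker run --rm -v $dataset_output_dir:/mnt/dataset gazebo
    echo "run $i"
done

trap - INT   # restore default SIGINT handling
```

One caveat: a foreground `docker run` also receives the SIGINT, so the in-progress container would still be killed; the flag only guarantees that no *further* runs are started.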