bindings: package ns-3 as a pip wheel
It turns out repackaging ns-3 as a pip package isn't that hard.
```shell
$ pip install cppyy cmake-build-extension
$ python3 setup.py bdist_wheel
$ pip install ./dist/ns_3*   # wildcard: some crazy name generated based on the git tree
```
This wheel can be hosted on PyPI, GitLab or GitHub, so that users can just `pip install` it and get a released version.
Then we can import and use it:

```shell
$ python3 -c "from nsnam.ns import ns; nodes = ns.NodeContainer(); nodes.Create(2); print(nodes.Get(0).__deref__())"
<cppyy.gbl.ns3.Node object at 0x55e33d4a15c0>
```
I've used nsnam as the install prefix and top-level package directory so the visualizer ships too: nsnam.ns refers to the Python bindings and nsnam.visualizer to the visualizer.
It can also be used for C++ development, since it ships with all the headers, the CMake package and the pkg-config files, but that is less than ideal since people won't be able to change the upstream libraries.
Opening as a draft, since I'm not sure upstreaming this is a good idea. But it sounds pretty cool for deployment in educational environments. And it can totally be used with Jupyter without setting LD_LIBRARY_PATH (I discovered I had left an RPATH setting misconfigured) or adding directories to the PYTHONPATH.
Edit: Tom asked what the workflow with the packaged ns-3 would be and how it differs from what we currently have.
We have a few different cases to handle.
- Build from source:
  - 1.1 Git cloning ns-3-dev
  - 1.2 Downloading ns3-allinone
  - 1.3 Using bake
- Build with pre-built module packages:
  - 2.1 Homebrew (Mac)
  - 2.2 Apt (Debian)
  - 2.3 Spack
  - 2.4 PyPI (this MR)
For people who want to change upstream modules in C++, the only option is still building from source (cases 1.1, 1.2, 1.3).
For these people, the Python bindings will continue to work exactly as they do today. Nothing will change.

```shell
$ ./ns3 configure --enable-python-bindings && ./ns3 run first.py
```
For people who just want to add new models in a custom module, or build simulation "scripts" in C++, but don't want to handle the configuration and build steps, pre-built packages are the way to go. Nothing will change for them either.
This MR simply adds another pre-built option, focused on serving all Python users (currently only for Linux).
A simple `pip install nsnam` and you can go straight to writing simulation scripts (in this case, without the quotes).
There is no need for additional modules to interface with common ML frameworks, since you can call them from Python.
You can use Matplotlib for plotting, Scapy to process pcap files from simulations, built-in json/xml/pickle tools to load/dump data. Visualizer works just fine.
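As a sketch of that last point: results collected from a run can be dumped and reloaded with nothing but the standard library. The metric names below are made up for illustration, not anything ns-3 produces.

```python
import json

# Hypothetical per-flow metrics collected from a simulation run
results = {
    "flow_1": {"throughput_mbps": 4.8, "mean_delay_ms": 12.3},
    "flow_2": {"throughput_mbps": 2.1, "mean_delay_ms": 40.7},
}

# Dump to disk so the run can be post-processed or plotted later
with open("results.json", "w") as f:
    json.dump(results, f, indent=2)

# Reload and inspect in a later session
with open("results.json") as f:
    loaded = json.load(f)

print(loaded["flow_1"]["throughput_mbps"])
```

The same pattern works with pickle for arbitrary Python objects, at the cost of portability.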
My use case for the packaged version is educational use with Jupyter, but it could also be used to distribute a pre-packaged build for reproducibility (it is not uncommon for toolchains to break old software).
Since pkg-config and CMake package files are also included, the pip wheel can also be used by C++ users who do not want to change upstream modules.
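For the C++ side, a consumer project could in principle point CMake at the wheel's install prefix and pick up the shipped package files. A hypothetical sketch (the prefix variable and target names here are assumptions for illustration; check the actual exported names in the wheel):

```cmake
# Hypothetical CMakeLists.txt for a project consuming the pip-installed ns-3.
cmake_minimum_required(VERSION 3.13)
project(my-sim CXX)

# NSNAM_PREFIX is assumed to be the nsnam directory inside site-packages
list(APPEND CMAKE_PREFIX_PATH "${NSNAM_PREFIX}")

find_package(ns3 REQUIRED)

add_executable(my-sim main.cc)
target_link_libraries(my-sim PRIVATE ns3::libcore)
```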
Workflow flowchart
```mermaid
graph LR
    B[You want to use ns-3] --> C{Pick a version}
    C --> D{Source}
    C --> E{Compiled}
    D --> F[Git clone]
    D --> G[ns3-allinone]
    D --> H[Bake]
    E --> I[Apt - Debian]
    E --> J[Homebrew - Mac]
    E --> K[Spack]
    E ----> L[PyPI - Linux]
    F --> M[Configure]
    G --> M
    H --> M
    I --> N
    J --> N
    K --> N
    L ----> O
    M --> N[Build]
    N --> O[Run]
    O --> P{Achieved the goal?}
    P -->|No| Q[Modify]
    Q -->|C++| M
    Q -->|C++| N
    Q -->|Python| O
    P ---->|Yes| R[End]
```