MRT Dumps with GoBGP
I’ve been logging BGP route updates into MySQL for some time, but this doesn’t scale well and, on my hobby VM system, requires that I dump the DB and start over once a day.
One of the things I’ve wanted to do for a long time is to be able to work with MRT files – to ingest route updates from MRT files, and once I had that capability, to produce my own MRT files.
What is MRT?
MRT stands for Multi-Threaded Routing Toolkit. The RFCs describe the MRT format for routing information export. The format can be used to export routing protocol messages, state changes, and routing information base contents.
MRT is an industry standard. As a result, there are many standard tools that can create MRT files, libraries to read and write MRT files, and CLI programs to read MRT files and do searches in them.
- RFC6396: Multi-Threaded Routing Toolkit (MRT) Routing Information Export Format
- RFC6397: MRT BGP Routing Information Export Format with Geo-Location Extensions
- RFC8050: MRT Routing Information Export Format with BGP Additional Path Extensions
What took so long?
Long ago I didn’t know what MRT files were.
Then I realized that BGP updates could be logged, but I didn’t find any tools I liked at the time to read them.
Years later I realized that RIB state could be dumped as well. Around the same time I used a few different tools to create MRT dumps (OpenBSD, a branch of BIRD with support for it), and a few different tools to try to read my own dumps and others’ dumps (mostly RIPE RIS bviews). Some tools were easier than others, but none of them really made me happy.
Recently I discovered BGP Scanner from Isolario. It’s ultra fast, and has a machine- and human-readable pipe-separated output that I thought would lend itself well to being piped into something else. I wrote mrt2mysql.py almost immediately and was importing select data into a MySQL DB from MRT right away (select data, rather than everything, from the DFZ).
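The idea behind mrt2mysql.py can be sketched in a few lines: read bgpscanner’s pipe-separated output from stdin and split each line into fields ready for a parameterized database insert. This is an illustrative sketch, not the actual script, and the field semantics shown in the test string are assumptions – consult bgpscanner’s documentation for the real column order.

```python
import sys

def parse_line(line):
    """Split one bgpscanner pipe-separated output line into fields.

    bgpscanner emits one route per line with fields separated by '|';
    the meaning of each column is defined by bgpscanner itself, so
    this sketch only does the generic split.
    """
    return line.rstrip("\n").split("|")

if __name__ == "__main__":
    for line in sys.stdin:
        fields = parse_line(line)
        # A real importer would hand `fields` to a parameterized
        # INSERT statement here instead of printing them.
        print(fields)
```

Piping `bgpscanner somefile.dump | python3 this_script.py` would then print one field list per route.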
MRT Data Repos
GoBGP can be configured using TOML (the default), YAML, or HCL. The examples are mostly in TOML, which Hugo also uses, so I went with that.
```toml
[global.config]
  as = 65535
  router-id = "192.0.2.0"

[[neighbors]]
  [neighbors.config]
    neighbor-address = "192.0.2.255"
    peer-as = 65534
  [neighbors.add-paths.config]
    receive = true
  [neighbors.ebgp-multihop.config]
    enabled = true
    multihop-ttl = 32
  [neighbors.transport.config]
    passive-mode = false

[[neighbors]]
  [neighbors.config]
    neighbor-address = "2001:DB8::255"
    peer-as = 65534
  [neighbors.add-paths.config]
    receive = true
  [neighbors.ebgp-multihop.config]
    enabled = true
    multihop-ttl = 32
  [neighbors.transport.config]
    passive-mode = false

[[mrt-dump]]
  [mrt-dump.config]
    dump-type = "updates"
    file-name = "/mrt/updates/2006_01/20060102.1504.dump"
    rotation-interval = 300

[[mrt-dump]]
  [mrt-dump.config]
    dump-type = "table"
    file-name = "/mrt/rib/2006_01/20060102.1504.dump"
    rotation-interval = 28800
```
You might wonder why the file name seems to reference a certain time on January 2nd, 2006. That’s the reference date/time that Go date formatting uses.
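GoBGP expands that Go reference-time layout at each rotation. To make the mapping concrete, the equivalent strftime pattern (my own translation, for illustration) would be `%Y_%m/%Y%m%d.%H%M`:

```python
from datetime import datetime

# The layout "2006_01/20060102.1504" in Go corresponds to the
# strftime pattern "%Y_%m/%Y%m%d.%H%M" (year_month directory,
# then yearmonthday.hourminute in the file name).
now = datetime(2019, 1, 30, 8, 0)  # an example rotation time
path = "/mrt/updates/" + now.strftime("%Y_%m/%Y%m%d.%H%M") + ".dump"
print(path)  # /mrt/updates/2019_01/20190130.0800.dump
```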
The rotation interval for the RIB dumps is 8 hours, and the BGP updates are written to a new file every 5 minutes. If I start gobgpd at midnight, my RIB dumps are written at midnight, 8am, and 4pm (0000, 0800, and 1600). This is what RIPE does, and it seems a reasonable trade-off, given that all the updates in between each 8-hour interval are logged.
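The schedule falls out of the 28800-second interval directly; a quick sanity check:

```python
from datetime import datetime, timedelta

# RIB dumps rotate every 28800 s (8 h); updates every 300 s (5 min).
start = datetime(2019, 1, 30, 0, 0)        # gobgpd started at midnight
rib_interval = timedelta(seconds=28800)

# The three RIB dump times within one 24-hour day:
dump_times = [(start + i * rib_interval).strftime("%H%M") for i in range(3)]
print(dump_times)  # ['0000', '0800', '1600']
```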
I’m currently updating my Canadian BGP routing data in a loop, using only route updates that match as$, fed from a list of Canadian ASNs generated by blockfinder. This allows me to feed in data from my own collection as well as from locations where major Canadian carriers peer in the US (New York seems like a good one). This is providing me with a very accurate view of Canadian AS-PATHs, including views that are only seen locally.
BGP Scanner Script
```shell
MRTFILE="/mrt/rib/2019_01/20190130.0000.dump"

for as in `cat /mrt/ca-asn-latest.txt`; do
    echo "$as\$"
    bgpscanner -p "$as\$" $MRTFILE | /path/to/mrt2mysql/mrt2mysql-single.py
done
```
The GoBGP GitHub project README is very useful and has links to all of their documentation. Selected pieces that were referenced for this blog are: