The Magic of Matplotlib Stylesheets; WasmEdge; Earth: Armed And Fully Operational
(No, my Substack is still not hacked.)
Speaking of hacked, however, now's as good a time as any to remind you to lock down any GitHub-hosted domains you might have.
The Magic of Matplotlib Stylesheets
There are many ways to create data visualizations in Python: Matplotlib, Plotly, Altair, and others all have their adherents. Still, judging by the sheer volume of resources available pertaining to Matplotlib, it sure seems to be the de facto way for Python users to make plots.
Almost all the other charting libraries in the above list make interactive plots. I'm a big believer in "static-first" charts, as interactivity introduces complexity into the experience of the chart consumer, and I see far too many interactive data visualizations published that are just default Plotly or Altair exports that offer no guidance or curation (something I believe all consumer-facing interactive charts should have).
If you're still reading (thank you!), you're likely a Python user who defaults to Matplotlib, and you likely also rely on the Matplotlib defaults, which produce charts that make my eyes bleed every time I come across one on the internet. This reliance on the defaults also makes me think Python folks can't be bothered to RTFM, since Matplotlib supports stylesheets and ships with batteries included (i.e. has many to choose from).
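To illustrate (a minimal sketch; the style name and data here are mine, not from any post referenced in this issue), switching away from the defaults is essentially a one-liner:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

# See which stylesheets ship with Matplotlib...
print(plt.style.available)

# ...and activate one before plotting.
plt.style.use("ggplot")

fig, ax = plt.subplots()
ax.plot([1, 2, 3, 4], [1, 4, 9, 16])
ax.set_title("Same plot, nicer defaults")
fig.savefig("styled_demo.png")
```

If you only want a style for a single figure, `with plt.style.context("ggplot"):` applies it temporarily instead of globally.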
You can visit Matplotlib's docs to see each style rendered. Nearly every single one of them is better than the default, and while using the builtins is all well and good, you aren't constrained to just those.
Robert Ritz (@RobertERitz) has a recent tutorial (Deepnote notebook version) on creating these style sheets that walks you through building one with heavy customizations. The section banner is from Robert's post and sports a much cleaner creation than you'd get with the defaults.
I highly recommend going over the post and coming up with "your style", whether it be something you use personally or one that makes use of the brand styles of your company/organization. Your charts will look better, be associated with you/your organization, and save my annual spend on Kleenex.
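Mechanically, "your style" is just an `.mplstyle` file of rcParams that you point `plt.style.use()` at. Here's a minimal sketch; every value below is an illustrative placeholder of mine, not taken from Robert's tutorial:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt
from pathlib import Path

# Hypothetical brand style; all values are placeholders.
# Note: hex colors in style files omit the leading "#" (it starts a comment).
Path("mystyle.mplstyle").write_text(
    "axes.facecolor: f7f7f7\n"
    "axes.grid: True\n"
    "grid.color: dddddd\n"
    "axes.spines.top: False\n"
    "axes.spines.right: False\n"
    "font.size: 12\n"
)

plt.style.use("mystyle.mplstyle")
fig, ax = plt.subplots()
ax.bar(["a", "b", "c"], [3, 7, 5])
fig.savefig("branded.png")
```

Ship the file with your project (or drop it in your Matplotlib config directory) and every chart picks up the house look with one line.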
WasmEdge
With my apologies in advance for the buzzword-laden prose you're about to stomach, here's the WasmEdge developers' pitch:
WasmEdge is a cloud-native WebAssembly runtime hosted by the Cloud Native Computing Foundation (CNCF). It is widely used in edge computing, automotive, Jamstack, serverless, SaaS, service mesh, and even blockchain applications. Featuring AOT compiler optimization, WasmEdge is one of the fastest WebAssembly runtimes on the market today.
Modern web apps feature rich UIs that are rendered in the browser and/or on the edge cloud. WasmEdge works with popular web UI frameworks, such as React, Vue, Yew, and Percy, to support isomorphic server-side rendering (SSR) functions on edge servers. It could also support server-side rendering of Unity3D animations and AI-generated interactive videos for web applications on the edge cloud.
WasmEdge provides a lightweight, secure, and high-performance runtime for microservices. It is fully compatible with application service frameworks such as Dapr, and service orchestrators like Kubernetes. WasmEdge microservices can run on edge servers, and have access to distributed cache, to support both stateless and stateful business logic functions for modern web apps. Also related: serverless function-as-a-service in public clouds.
Serverless SaaS functions enable users to extend and customize their SaaS experience without operating their own API callback servers. The serverless functions can be embedded into the SaaS or reside on edge servers next to the SaaS servers. Developers simply upload functions to respond to SaaS events or to connect SaaS APIs.
Smart device apps could embed WasmEdge as a middleware runtime to render interactive content on the UI, connect to native device drivers, and access specialized hardware features (e.g., the GPU for AI inference). The benefits of the WasmEdge runtime over native-compiled machine code include security, safety, portability, manageability, and developer productivity. WasmEdge runs on Android, OpenHarmony, and seL4 real-time operating system (RTOS) devices.
Far from being "yet-another-wasm-runtime", WasmEdge appears to have been designed from the ground up for maximum speed, portability, and security.
Two examples in particular stood out to me, since I've been proselytizing the inevitability of Wasm disrupting the data science/engineering community.
The first is their support for TensorFlow (link goes to a full, working Rust source project):
AI inference is a computationally intensive task that could benefit greatly from the speed of Rust and WebAssembly. However, the standard WebAssembly sandbox provides very limited access to the native OS and hardware, such as multi-core CPUs, GPU and specialized AI inference chips. It is not ideal for the AI workload.
The popular WebAssembly System Interface (WASI) provides a design pattern for sandboxed WebAssembly programs to securely access native host functions. The WasmEdge Runtime extends the WASI model to support access to native Tensorflow libraries from WebAssembly programs. The WasmEdge Tensorflow Rust SDK provides the security, portability, and ease-of-use of WebAssembly and native speed for Tensorflow.
The other involves neural networks via Neural Network for WASI (WASI-NN). WASI-NN is a WASI API for performing ML inference. Its name derives from the fact that ML models are also known as neural networks. ML models are typically trained using a large data set, resulting in one or more files that describe the model's weights. The model is then used to compute an "inference," e.g., the probabilities of classifying an image as a set of tags. This API is concerned initially with inference, not training.
That's right: WasmEdge can load, initialize, infer, and return values based on open standards for neural network-based model serialization.
Earth: Armed And Fully Operational
Given the longform-ish nature of the previous sections, you get a short cool science link for dessert.
Lightning doesn't always make contact with the ground. Occasionally, lightning will exit the top of a thunderstorm and connect to the lower edge of space, forming a gigantic jet.
This month, a cadre of researchers published observations of a negative gigantic jet that transferred an extraordinary amount of charge between the troposphere and ionosphere back in 2018.
By studying the jet's radio-wave emissions using satellite and radar data, the team learned that the bolt moved approximately 300 coulombs of charge from the top of the cloud to the lower ionosphere — the layer of charged particles that separates Earth's upper atmosphere from the vacuum of space — or roughly 60 times the 5-coulomb output of a typical lightning bolt.
"The charge transfer is nearly double the previous largest by a gigantic jet and is comparable to the largest ever recorded for cloud-to-ground strokes," the researchers wrote in the study.
(☝🏽 via Live Science)
One coulomb is a YUGE charge. Two one-coulomb charges held one meter apart exert a force of roughly nine billion newtons on each other (around 900,000 tonnes-force), which is roughly 720 times the thrust of a space shuttle solid rocket booster during liftoff.
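A quick back-of-envelope check of those figures via Coulomb's law (a sketch; the SRB liftoff-thrust value of ~12.5 MN is my assumption for the comparison):

```python
K = 8.9875e9  # Coulomb constant, N * m^2 / C^2

def coulomb_force(q1, q2, r):
    """Magnitude in newtons of the electrostatic force between two point charges."""
    return K * q1 * q2 / r**2

force_n = coulomb_force(1.0, 1.0, 1.0)   # two 1 C charges, 1 m apart
tonnes_force = force_n / 9806.65         # 1 tonne-force = 9806.65 N
srb_multiples = force_n / 1.25e7         # assumed SRB liftoff thrust of ~12.5 MN

print(f"{force_n:.2e} N = {tonnes_force:,.0f} tonnes-force = {srb_multiples:.0f}x an SRB")
```

Inverse-square scaling is the punchline here: double the separation and the force drops by a factor of four, but at one meter, one coulomb is monstrous.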
I'm thinking the Earth just has Death Star envy and decided to show off (ref: section header).
The journal article is surprisingly accessible (with some pretty neat charts to dig into), as is the Live Science post.
It'd be nice if, next time, one of those jets could be directed downward at 26.6771° N, 80.0370° W. ☮