Spotlight: Demos Gallery
Physiofunctional circadian metabolism...
PFun Glucose Demo
Interactive simulation of circadian-ultradian glucose dynamics (over a 24-hour period)
A quick prelude...
(EDITED: 2025-08-19)
Before you launch into these results... I'll give you some extra special ~~~insider information~~~...
I have another demo that's almost ready to show off...
I'll likely have time to write up a proper blog post in the near future.
For now, I'll leave you with a tasty hint...
aRtIfIcIaL nEuRaL nEtWoRkS...
An interactive biophysical simulation of glucose dynamics, estimated over a single 24-hour period. The curves shown represent: Meal-likelihood [0.0, 1.0], Glucose [0.0, 2.0], Insulin. This is a simplified version of the full model to avoid too much computational overhead (we're really maxing out Desmos here).
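For the curious, the general flavor of a glucose-insulin simulation like this can be sketched as a toy two-variable ODE with a meal forcing term. To be clear, this is not the actual PFun model; the equations, parameters, and meal times below are all invented for illustration:

```python
import numpy as np

# Toy glucose-insulin sketch (NOT the PFun model). Meal intake drives
# glucose up; insulin secretion tracks glucose and drives uptake.
# All rate constants and meal times are made up for illustration.

def meal_likelihood(t, meal_times=(7.0, 12.5, 18.5), width=0.5):
    """Smooth bumps in [0, 1] around typical meal times (hours)."""
    return min(1.0, sum(np.exp(-((t - m) / width) ** 2) for m in meal_times))

def simulate(hours=24.0, dt=0.01):
    n = int(hours / dt)
    g = np.empty(n)  # normalized glucose
    i = np.empty(n)  # normalized insulin
    g[0], i[0] = 1.0, 0.5
    for k in range(n - 1):
        t = k * dt
        m = meal_likelihood(t)
        dg = 0.8 * m - 0.5 * i[k] * g[k]   # intake minus insulin-mediated uptake
        di = 0.6 * g[k] - 0.4 * i[k]       # secretion minus clearance
        g[k + 1] = g[k] + dt * dg          # simple Euler step
        i[k + 1] = i[k] + dt * di
    return g, i

glucose, insulin = simulate()
```

Even this caricature produces meal-driven glucose excursions followed by insulin-driven recovery, which is the qualitative shape the demo curves trace out.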
Dexcom Developer OpenAPI schema
Dexcom Developer API (openapi.json)
Here's a (working) OpenAPI schema definition (openapi.json) that I put together as part of my PFun Digital Health AI product pipeline.
The linked openapi.json file can be used to easily generate a client or proxy server to the Dexcom Developer API.
Relevant links:
- Source: GitHub repository: rockybranches/dexcom-openapi-schema
- API Docs: Dexcom Developer API documentation site
- What is OpenAPI...?
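As a quick sanity check before generating anything from a schema like this, it's handy to load the JSON and list the routes. The fragment below is a tiny inline stand-in for the real openapi.json in the linked repo (the two paths shown are real Dexcom v3 endpoints, but the full schema has many more routes and complete parameter/response definitions):

```python
import json

# Tiny stand-in fragment for the real openapi.json in the linked repo.
schema_text = """
{
  "openapi": "3.0.1",
  "info": {"title": "Dexcom Developer API", "version": "3.0"},
  "paths": {
    "/v3/users/self/egvs": {"get": {"summary": "Estimated glucose values"}},
    "/v3/users/self/devices": {"get": {"summary": "Device info"}}
  }
}
"""

schema = json.loads(schema_text)

# List every route and method -- a quick check before feeding the
# schema to a client generator.
for path, ops in schema["paths"].items():
    for method, op in ops.items():
        print(f"{method.upper():4s} {path}  -- {op.get('summary', '')}")
```

With the real file in hand, generating a client is a one-liner with a tool like openapi-generator, e.g. `openapi-generator generate -i openapi.json -g python -o client/`.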
(AI-generated image, DALL-E)
Learning about "gie", a CLI tool for validating PROJ coordinate transformations
I recently happened across "gie" (a.k.a. "Geospatial Integrity Investigation Environment").
See below for an AI-generated sample use-case. This example was generated using the mods CLI tool from Charm, with gpt-4o as the LLM backend:

mods -f 'show an example of using the CLI "gie" tool for proj validation'

The "gie" tool validates coordinate transformations defined in PROJ. Here's an example of
how to use the "gie" tool for proj validation:
### Step-by-Step Example
1. Create a GIE File:
• Write a GIE script that specifies the transformation you want to
validate. Save it as example.gie.
<gie>
operation +proj=merc +lat_ts=56.5
tolerance 10 mm
accept 12 56
expect 1335833.89 7514065.63

operation +proj=utm +zone=33 +ellps=WGS84
accept 12 56
expect 500000 6219671.61
</gie>
2. Run the GIE Command:
• Open your terminal.
• Navigate to the directory where the example.gie file is located.
• Run the validation using the gie tool.
gie example.gie
3. Interpret the Output:
• The gie tool will read the script and execute the transformations.
• It will compare the actual output with the expected output mentioned in
the expect lines.
• If the actual results match the expectations, the tool will confirm the
validation.
• Any discrepancies will be shown in the output for further investigation.
This example uses two different operations to transform geographical coordinates and verify that the results are as expected. Adjust the operation, accept, and expect lines according to your specific validation needs.
PFun Fiona, AI-MD
A sneak peek of PFun Fiona, AI-MD-- the one-and-only Physiofunctional Digital Health Assistant.
Messing around with music & stuff
(Updated: 2023-05-16)
I've been messing around with the read/write methods in scipy.io.wavfile to make some "music" using the output of some of my biophysical models.
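The core write-a-wav step looks something like the sketch below. The "model output" here is just a slow sine wave standing in for a real biophysical variable, used to modulate the pitch of an audible carrier; the rates, frequencies, and filename are arbitrary choices for the example:

```python
import numpy as np
from scipy.io import wavfile

rate = 44100                                   # samples per second
t = np.linspace(0, 3.0, 3 * rate, endpoint=False)

# Stand-in "model output": a slow oscillation (e.g. an ultradian rhythm).
slow = np.sin(2 * np.pi * 0.5 * t)

# Use it to modulate the pitch of an audible carrier. Integrating the
# instantaneous frequency gives a smooth phase (proper FM).
freq = 440 + 60 * slow
phase = 2 * np.pi * np.cumsum(freq) / rate
carrier = np.sin(phase)

# Normalize to the 16-bit integer range before writing.
audio = np.int16(carrier / np.max(np.abs(carrier)) * 32767)
wavfile.write("model_music.wav", rate, audio)
```

Swapping in actual simulation trajectories for `slow` (or mapping several model variables to pitch, amplitude, and rhythm) is where the "parametrized" sounds below come from.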
Example 1
So for example, here's some simulation output with an audio track (caution, may be loud):
So another interesting point... these sounds can be parametrized...
Example 2
Here's another example with different parameters (again, caution: may be loud):
So as you can see, there's a wide range of different sound patterns that can be produced, even with just this one model.
Next Steps...
I'm thinking I'll make a public API at some point in the near future so you can make your own physiological music... π
Update!
Some new examples... with rhythm!
Example 2-0
Example 2-1
Example 2-2
Dynamical systems model of cardio-respiratory interactions
Currently I'm working on a model of cardio-respiratory interactions. Given human experimental data, the model reproduces the average heart rate, the average respiratory phase duration (inspiration/expiration), and the Respiratory Sinus Arrhythmia (RSA). The most sophisticated part of the model is the respiratory central pattern generator (previously published). My current goal is to optimize the model so that it reproduces RSA for several individuals.
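For readers unfamiliar with RSA, the basic effect can be caricatured in a few lines: heart period lengthens during expiration and shortens during inspiration. This toy sketch is emphatically not the published CPG model; the breathing rate, baseline RR interval, and coupling gain are all invented for illustration:

```python
import numpy as np

# Toy caricature of respiratory sinus arrhythmia (RSA): the RR interval
# (time between heartbeats) is modulated by respiratory phase.
# All parameters are invented for illustration.

def rr_intervals(n_beats=60, resp_freq=0.25, rr0=0.85, gain=0.08):
    """Generate RR intervals (s) modulated by a sinusoidal respiratory phase."""
    rr = []
    t = 0.0
    for _ in range(n_beats):
        resp_phase = 2 * np.pi * resp_freq * t
        interval = rr0 + gain * np.sin(resp_phase)   # RSA modulation
        rr.append(interval)
        t += interval
    return np.array(rr)

rr = rr_intervals()
print(f"mean HR: {60 / rr.mean():.1f} bpm, RSA amplitude: {rr.max() - rr.min():.3f} s")
```

Fitting a model like the real one to an individual amounts to tuning the analogue of `gain` (and much else) until the simulated RSA matches that person's measured beat-to-beat variability.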
After I get it working with the current dataset-- which was measured while participants watched Disney's Fantasia-- the next step is to fit the model to real clinical data from septic ER patients. The end goal is for our collaborators to use our model as a basis for a machine learning algorithm that predicts the risk and most likely cause of death for septic patients.
Below are some pretty pictures generated by the model.
Haiku Twitter Bot
Over summer 2016, I decided to make a Twitter bot just for fun. Here, I'll describe the bot's programming at a high level. But first, a little background...
I started undergrad as an English major... then graduated with a degree in neuroscience. Although I have pretty firmly switched my career plans toward math and science, I still appreciate and respect art. As a reflection on my artistic pipe dream, I designed my bot to write haikus. If you don't remember haikus from high school, hit that link in the previous sentence.
For connecting to Twitter, I used the tweepy library. I simply wrote a subclass of StreamListener to process incoming tweets and write them to a file called tweets.txt. You can read more about this process in the tweepy docs (linked above).
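The listener logic is only a few lines. The sketch below keeps it standalone so it runs without credentials or a live connection; with tweepy installed, `TweetSaver` would subclass `tweepy.StreamListener` (the pre-4.0 API, as it existed when this bot was written) and be attached to a `tweepy.Stream`:

```python
from types import SimpleNamespace

# Standalone sketch of the listener described above. With tweepy, this
# would be `class TweetSaver(tweepy.StreamListener)`; the base class is
# omitted here so the snippet runs on its own.
class TweetSaver:
    def __init__(self, path="tweets.txt"):
        self.path = path

    def on_status(self, status):
        # tweepy calls this for each incoming tweet; append its text to file.
        with open(self.path, "a") as f:
            f.write(status.text.replace("\n", " ") + "\n")

# Simulate two incoming tweets.
listener = TweetSaver()
for text in ["old pond / frog leaps in", "water's sound"]:
    listener.on_status(SimpleNamespace(text=text))
```

Everything downstream (syllable counting, word choice) then reads from tweets.txt rather than touching the network.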
Each time I start my bot, most of the high-level functionality is located in one function, do_tweet().
For counting syllables and determining parts of speech, I used NLTK.
Counting syllables:
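The bot's actual counter used NLTK (a dictionary-based count, e.g. via the CMU pronouncing dictionary, is the usual approach there). As a self-contained stand-in, here's a simple vowel-group heuristic that captures the idea:

```python
import re

def nsyl(word):
    """Rough syllable count: number of vowel groups, with a small
    correction for a silent trailing 'e'. A stand-in for the bot's
    NLTK-based count; it's wrong on some words, but close enough
    to illustrate the 5/7/5 bookkeeping."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1          # e.g. "write" -> 1, but keep "little", "tree"
    return max(count, 1)

print(nsyl("haiku"))   # 2
print(nsyl("banana"))  # 3
```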
Choosing words for the haiku:
As you can see, the algorithm randomly chooses a word of syllable length sw, which is generated by the pick_syl() function. If this is hard to interpret, don't worry. It took me a while to come up with an algorithm that would randomly choose words but still adhere to the 5/7/5 haiku format. The conditions like if ix == 0 and line == 0 are there to determine which line of the haiku is being written. In this case, the first word of the first line is being chosen. Then the lines:
sdict1 = dict([(sk, sv) for sk, sv in sdict.items()
               if pos_tag(word_tokenize(sk), tagset='universal')[0][1] == 'NOUN'])  # chooses a noun
gsyls = syl_of_size(sdict1, sw)  # chooses a noun of syllable length sw
Then, setting up the haiku in order and writing to the file haiku.txt:
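The assembly step itself is simple bookkeeping. This is a hypothetical sketch rather than the bot's exact code: the three lines are placeholders standing in for the randomly chosen words, joined in 5/7/5 order and appended to haiku.txt:

```python
# Hypothetical sketch of the final step: join the three generated lines
# and append the finished haiku to haiku.txt. The line contents here are
# placeholders for the bot's randomly chosen words.
lines = ["autumn wind rises", "a sparrow settles on the", "cold telephone wire"]

haiku = "\n".join(lines)
with open("haiku.txt", "a") as f:
    f.write(haiku + "\n\n")   # blank line separates successive haikus

print(haiku)
```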
So currently this bot would write haikus with randomly chosen words, which is cool... but we can do better! ;^)
Eventually when I have a little more time, I'll give meta ai ai haiku more smarts, but for now the bot uses a simple algorithm for determining the "best" tweets, then recycles words from these tweets. Dumb, I know... it's a work in progress.
The function get_best() chooses tweets with the highest "weight" to include in the bot's corpus for writing future haikus.
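The shape of that function is straightforward; here's a hypothetical sketch where the "weight" is favorites plus retweets (an invented stand-in for the bot's actual metric):

```python
# Hypothetical sketch of get_best(): score each saved tweet and keep the
# top-weighted ones as source material for future haikus. The weighting
# (favorites + retweets) is an invented stand-in for the real metric.
def get_best(tweets, k=3):
    return sorted(tweets, key=lambda t: t["favs"] + t["rts"], reverse=True)[:k]

tweets = [
    {"text": "old pond", "favs": 12, "rts": 4},
    {"text": "morning dew", "favs": 2, "rts": 1},
    {"text": "first snow", "favs": 30, "rts": 9},
    {"text": "late train", "favs": 0, "rts": 0},
]
best = get_best(tweets, k=2)
print([t["text"] for t in best])   # ['first snow', 'old pond']
```

Words from the winning tweets then get folded back into the corpus the haiku generator samples from.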
Here's a link to my Twitter bot meta ai ai haiku: robcapps.com/docs/haiku. Please feel free to ask questions in the comments section!
QGIS Area of Availability Plugin
I'm working on a set of QGIS plugins, and one that I think is worth sharing is an area-of-availability calculator. (EDIT: you should be able to install the plugin via the public QGIS repository soon)
Given a start point and a maximum distance to travel, we want to find the boundary of accessible regions via a road network.
The code for this calculation is below (taken from this page):
from PyQt4.QtCore import *
from PyQt4.QtGui import *
from qgis.core import *
from qgis.gui import *
from qgis.networkanalysis import *

vl = qgis.utils.iface.mapCanvas().currentLayer()
director = QgsLineVectorLayerDirector(vl, -1, '', '', '', 3)
properter = QgsDistanceArcProperter()
director.addProperter(properter)
crs = qgis.utils.iface.mapCanvas().mapRenderer().destinationCrs()
builder = QgsGraphBuilder(crs)

pStart = QgsPoint(65.5462, 57.1509)
delta = qgis.utils.iface.mapCanvas().getCoordinateTransform().mapUnitsPerPixel() * 1

rb = QgsRubberBand(qgis.utils.iface.mapCanvas(), True)
rb.setColor(Qt.green)
rb.addPoint(QgsPoint(pStart.x() - delta, pStart.y() - delta))
rb.addPoint(QgsPoint(pStart.x() + delta, pStart.y() - delta))
rb.addPoint(QgsPoint(pStart.x() + delta, pStart.y() + delta))
rb.addPoint(QgsPoint(pStart.x() - delta, pStart.y() + delta))

tiedPoints = director.makeGraph(builder, [pStart])
graph = builder.graph()
tStart = tiedPoints[0]
idStart = graph.findVertex(tStart)

(tree, cost) = QgsGraphAnalyzer.dijkstra(graph, idStart, 0)

upperBound = []
r = 2000.0
i = 0
while i < len(cost):
    if cost[i] > r and tree[i] != -1:
        outVertexId = graph.arc(tree[i]).outVertex()
        if cost[outVertexId] < r:
            upperBound.append(i)
    i = i + 1

for i in upperBound:
    centerPoint = graph.vertex(i).point()
    rb = QgsRubberBand(qgis.utils.iface.mapCanvas(), True)
    rb.setColor(Qt.red)
    rb.addPoint(QgsPoint(centerPoint.x() - delta, centerPoint.y() - delta))
    rb.addPoint(QgsPoint(centerPoint.x() + delta, centerPoint.y() - delta))
    rb.addPoint(QgsPoint(centerPoint.x() + delta, centerPoint.y() + delta))
    rb.addPoint(QgsPoint(centerPoint.x() - delta, centerPoint.y() + delta))
I'm not quite finished with my plugin implementation, but you can check out the source code on my GitHub.
If you want to try your hand at developing a QGIS plugin, I strongly recommend starting with Plugin Builder.