Below are examples deemed worthy of the front page…
Find the tools in action on Heroku as a Node.js app!
See the code on GitHub:
After many iterations of ideas for deploying a few research Shiny R apps, I am glad to say the current web-only setup is 100% free and simple to adapt. I thought I’d go through some of the Node.js bits I have been fussing with.
The current setup:
Heroku has a free tier for Node.js apps; see the pricing and limitations here: https://www.heroku.com/pricing. As far as I can tell, there is little reason to read too far into a free plan: they don’t have my credit card, and they seem to convert enough folks into paid customers to be able to offer a free something to everyone.
Shiny apps (https://www.shinyapps.io/) work straight from RStudio, and they have a free plan too. As with Heroku, I can’t care too much about limitations when it is completely free.
The reasons to use Node.js (even if it is just a Jade/HTML wrapper) are numerous, though they may not be completely obvious. If nothing else, Heroku will serve it for free….
Using Node is nice because you get the whole web layout/UX/UI stack of stuff if you need it. Clearly, I have not gone to great lengths to use it here, but it is there.
Another big one is using Node.js with Electron (https://electronjs.org/). The idea is that a desktop app framework serves your Node app to itself via Chromium. I had a bit of a foray with Electron: the execa package (npm install execa) let me launch a Shiny server from Electron, wait a moment, then load a Node/browser app that acts as an interface to the Shiny process. While this mostly worked, it is definitely overkill for my Shiny stuff. Good to have as a tool, though.
As many may intuit, I like the AWS ecosystem; it is easy to navigate and usually just works.
…However, more than $1,000 later, I no longer use AWS for most things….
Selective sync: I need an unsync function for projects and files due to the tiny 256 GB SSD on my laptop (odrive is great, just not perfect for cloud computing).
Shared file system: access files from Windows and OSX, locally and remote
Server must be headless, rebootable, and work remotely from under a heavy enterprise NAT (College)
Needs more than 8 GB of RAM
Runs a Windows desktop remotely for GIS applications (OSX on my laptop)
Have as much shared file space as possible: 12TB+
Server: recycled, remote, works-under-enterprise-NAT:
Recycled Dell 3010 with i5: https://www.plymouth.edu/webapp/itsurplus/
– Cost: $75 (+ ~$200 for Windows 10 Pro, an inevitable license expense)
+ free spare 16 GB RAM lying around, local SSD and 2TB HDD upgrades
– Does Microsoft-specific GIS bidding, can leave running without hampering productivity
Resilio (bittorrent) Selective sync: https://www.resilio.com/individuals/
– Cost: $60
– p2p Data management for remote storage + desktop
– Manages school NAT and port restrictions well (remote access via relay server)
Attached to and syncs with an additional 10TB Drobo RAID array, repurposed for NTFS
What I see: the front end
Jump VNC Fluid service: https://jumpdesktop.com/
– Cost: ~$30
– Super efficient Fluid protocol; clients include Chrome OS and iOS (with mouse support!)
– Manages heavy NAT and port restrictions well
– GUI for everything, no tunneling around a CLI
Jetbrains development suite: https://www.jetbrains.com/ (OSX)
– Cost: FREE as a verified GitHub student user.
– PyCharm IDE, Webstorm IDE
Total (extra) spent: ~$165
(For comparison: my AWS bill for October alone was $262.)
If you happen to be working with…. KML data (or any data with large description strings) and transitioning it into the ESRI Story Map toolset, there is a very good chance you hit the dBase 254-character field length limit with the ESRI Shapefile upload. Shapefiles are always a terrible idea.
The solution: with GDAL or QGIS (alright, even in ArcMap), you can use GeoJSON as an output format AND import it into the story map system, with complete long description strings!
Merge vector layers -> save to file -> GeoJSON
import os
import arcpy

arcpy.env.workspace = "/desktop/arcmapstuff"
arcpy.FeaturesToJSON_conversion(os.path.join("outgdb.gdb", "myfeatures"), "output.json")
ogr2ogr -f GeoJSON output.json input.kml
View the tools here: http://kml.jessdev.org
Three of my KML tools are now stable and on GitHub. These are actually displayed via the static site generator Hugo (read about the Hugo CLI here), which is sitting on the Shiny server (port 3838) next to the apps. Messy, but it will do for now.