Trans: Latin prefix implying "across" or "beyond", often used in gender-nonconforming contexts – Scend: archaic word describing a strong "surge" or "wave", originating with 15th-century English sailors – Survival: 15th-century English compound word describing an existence only worth transcending.

Category: DIY

Prius, Printers

Add an EV-only mode button to a 2009 Prius

Pinouts and wiring reference here:

Big shiny EV mode button in the Prius!

Fusion 360 files here:

Files uploaded to thingiverse here:

xposted to prius chat too:

PLA & Carbon Polycarbonate button housings:

While we're at it....

See more notes on D&M 3d Printer stuff on github here:
...and here:

While at a safe distance…

...Playing with Bandlab's Sonar reboot --> morning metal. Frankly, the whole suite (yes, Melodyne, the whole nine yards) is way better than it was in the late Cakewalk days, and it's all free now. PSA!

...Unexpected success with
Nylon 680 FDA {3mm @ .8} for some rather delicate parts:

...Yet another improved pi monitoring sketch, currently in production w/ polycarbonate & 1/4"... ...or to quote Mad-eye Moody, "CONSTANT VIGILANCE!" 🙂


D&M Shields – Fusion 360

As of 4/4/20, we are busy 3D printing our rigid shield design, efficiently hacked into its current form by Bret here at D&M. Click here to visit or download the Fusion files!

The flat, snap-fit nature of this design can easily be lasercut as well- the varied depths of the printed model are just an effort to minimize excess plastic and print time.

More to come on the laser side of things- in addition to the massive time savings (<20 seconds vs. >3 hours per shield), we can use far cheaper and more varied materials with the addition of our sterilizable and durable UV resins and coatings. Similarly, lasercut stock + resin offers the possibility of quick adaptation and derivative designs, such as flexible UV-cured forms.

JDK Management in R

Quickly & forcefully manage extra JDKs in base R
Simplify rJava woes

# get this script:

rJava is depended upon by lots of libraries- XLConnect, OpenStreetMap, many DB connectors- and is often needed while scripting with GDAL.

library(XLConnect)   # YMMV

Errors while importing a library that depends on a JDK are many, but can (usually) be resolved by reconfiguring to the version listed somewhere in the error.

On macOS (Mojave at least), check what you have installed here (as admin, this is a system path):

sudo ls /Library/Java/JavaVirtualMachines/

I seem to usually have at least half a dozen versions in there, between Oracle and OpenJDK. Being Java, these are basically sandboxed as JVMs and will not get in each other's way.


Unlike JDK configuration for just about everything else, aliasing or exporting a specific release to $PATH will not cut it in R. The shell command to reconfigure for R-

sudo R CMD javareconf

...seems to always choose the wrong JDK. Renaming, hiding, otherwise trying to explain to R the one I want (lib XLConnect currently wants none other than Oracle 11.0.1) is futile.
The end-all solution for me is usually to temporarily move other JDKs elsewhere.
This is not difficult to do now and again, but keeping a CLI totally in R for moving / replacing JDKs makes for organized scripting.

 JDKmanager help: 
 (args are not case sensitive) 
 (usage: `sudo rscript JDKmanager.R help`) 

 list    :: prints contents of default JDK path and removed JDK path 
 reset   :: move all JDKs in removed JDK path back to default JDK path 
 config ::  configure rJava.  equivalent to `R CMD javareconf` in shell 

 specific JDK, such as 11.0.1, 1.8, openjdk-12.0.2, etc: 
    searches through both the default and removed paths for the specific JDK. 
    if found in the default path, any other JDKs will be moved to the `removed JDKs` directory. 
    the specified JDK will be configured for rJava.

Decentralized Pi Video Monitoring w/ motioneye & BATMAN

Visit me here on GitHub
Added parabolic musings 10/16/19, see below

...On using motioneye video clients on Pi Zeros & Raspbian over a BATMAN-adv Ad-Hoc network

link: motioneyeos
link: motioneye Daemon
link: Pi Zero W Tx/Rx data sheet:
link: BATMAN Open Mesh

This implementation of motioneye is running on Raspbian Buster (as opposed to motioneyeos).

Calculating Mesh Effectiveness w/ Python:
Please take a look- the idea here is that one should be able to estimate the maximum plausible distance between mesh nodes before setting anything up. It can be run with no arguments-


...with no arguments, it should use default values (Tx = 20 dBm, Rx = |-40| dBm) to print this:

you can add (default) Rx Tx arguments using the following syntax:
                 python3 20 40
                 python3 <Rx> <Tx>                 

 57.74559999999994 ft = max. mesh node spacing, @
 Rx = 40
 Tx = 20

Regarding the Pi:
The Pi Zero uses an onboard BCM43143 wifi module. See above for the data sheet. We can expect around a ~19 dBm Tx signal from a BCM43143 if we are optimistic. Unfortunately, "usable" Rx gain is unclear in the context of the Pi.

Added 10/16/19:
Notes on generating an accurate parabolic antenna shape with FreeCAD’s Python CLI:

For whatever reason (likely my own ignorance), I have been having trouble generating an accurate parabolic dish shape in Fusion 360 (AFAICT, Autodesk is literally drenching Fusion 360 in funds right now, so I feel obligated to at least try). Bezier, spline, etc. curves are not suitable!
If you are not familiar with FreeCAD, the general approach- geometry is formed through fully constraining sketches and objects- is quite different from Sketchup / Tinkercad / Inventor / etc., as most proprietary 3D software does the “constraining” of your drawings behind the scenes.

From this perspective, you can see how the following script never actually defines or changes the curve / depth of the parabola; all we need to do is change how much of the curve to include. A wide, shallow dish can be made by only using the very bottom of the curve, or a deep / narrow dish by including more of the ever-steepening parabolic shape.
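To make that wide-vs-deep tradeoff concrete: for a parabola y = x²/(4f) with focal length f, a dish of diameter D cut from that curve has depth D²/(16f). A quick sketch (the function name is mine, not from the FreeCAD script):

```python
def dish_depth(diameter_mm, focal_mm):
    """Depth of a parabolic dish y = x^2/(4f) cut at the given diameter."""
    r = diameter_mm / 2.0
    return r * r / (4.0 * focal_mm)

# The curve itself never changes for a given focal length; a wide, shallow
# dish uses only the bottom of the curve, while a deep, narrow dish simply
# includes more of the ever-steepening shape.
```

For example, at f = 25 mm, a 100 mm dish is 25 mm deep, while a 50 mm dish from the same curve is only 6.25 mm deep.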

import Part, math

# musings derived from:

# thinking about units here:
tu = FreeCAD.Units.parseQuantity

def mm(value):
    return tu('{} mm'.format(value))

rs = mm(1.9)
thicken = -(rs / mm(15)) 

# defer to scale during fitting / fillet elsewhere 
# create a parabola with the symmetry axis (0,0,1)

# get only the right part of the curve

# make a solid

# apply a thickness


# Fill screen:
# Remove Part in default env:

FWIW, here is my Python implementation of a Tx/Rx "free space" distance calculator-

from math import log10
from sys import argv
# estimate free space dBm attenuation:
# ...using wifi module BCM43143:

# Tx = 19~20 dBm
# Rx = not clear how low we can go here

# d = distance Tx --> Rx
# f = frequency
# c = attenuation constant: meters / MHz = -27.55; see here for more info:

f = 2400  # MHz
c = 27.55 # RF attenuation constant (in meters / MHz)

def_Tx = 20  # expected dBm transmit
def_Rx = 40  # (absolute value) of negative dBm thresh

def logdBm(num):
    return 20 * log10(num)

def maxDist(Rx, Tx):
    dBm = 0
    d = .1  # meters!
    while dBm < Tx + Rx:
        dBm = logdBm(d) + logdBm(f) - Tx - Rx + c
        d += .1  # meters!
    return d

# Why not use this with arguments Tx + Rx from shell if we want:
def useargs():
    if len(argv) == 3:
        return True
    elif len(argv) == 1:
        print('\n\nyou can add (default) Rx Tx arguments using the following syntax: \n \
            python3 20 40 \n \
            python3 <Rx> <Tx>')
        return False
    else:
        print('you must use both Rx & Tx arguments or no arguments')
        raise SystemExit

def main():
    if useargs():
        arg = [int(argv[1]), int(argv[2])]
    else:
        arg = [def_Rx, def_Tx]

    print(str('\n ' + str(maxDist(arg[0], arg[1])*3.281) + \
        ' ft = max. mesh node spacing, @ \n' + \
        ' Rx = ' + str(arg[0]) + '\n' + \
        ' Tx = ' + str(arg[1])))

main()
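As a hedged aside, the 0.1 m search loop above can also be solved in closed form by setting its stopping condition to equality (the function name and this rearrangement are mine, not part of the original script):

```python
from math import log10

f = 2400   # MHz, as in the script above
c = 27.55  # attenuation constant used in the loop

def max_dist_closed(Rx, Tx):
    # Solve the loop's stopping condition
    #   20*log10(d) + 20*log10(f) - Tx - Rx + c = Tx + Rx
    # for d (in meters):
    return 10 ** ((2 * (Tx + Rx) - 20 * log10(f) - c) / 20)
```

With the defaults (Rx = 40, Tx = 20) this lands within a step or so of the loop's ~57.7 ft answer, without iterating.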


Persistent, Live Ubuntu for College

Below are live mirrors of my "PSU Hacking Club" Ubuntu repos.

Simple File Hosting

README yet to be updated with Makerspace-specific formatting

Static site built quickly with Hugo CLI

# on OSX
# get hugo

brew install hugo

# clone site

git clone
cd static-site

# Compile and compress public directory

zip -r public

# upload and host with sftp & ssh

> cd
> put

# new terminal window

# check your remote filesystem- the idea is:
> unzip
> rm -rf

visit us

Also, check out the evolving PSU Hacking Club wiki here!

xD  - Jess

Summer 2019 Update!

GIS Updates:

Newish Raster / DEM image → STL tool in the Shiny-Apps repo:

See the (non-load balanced!) live example on the Heroku page:

Summarized for a forum member here too:

CAD / CAM Updates:

Been revamping my CNC thoughts- 

Basically, the next move is a complete rebuild (primarily for 6061 aluminum).

I am aiming for:

  • Marlin 2.x.x around either a full Rambo or 32-bit Archim 1.0
  • Dual endstop configuration, CNC only (no hotend support)
  • 500 mm square work area / swappable spoiler boards (~700mm exterior MPCNC conduit length)
  • Continuous compressed air chip clearing, shop vac / cyclone chip removal
  • Two chamber, full acoustic enclosure (cutting space + air I/O for vac and compressor)
  • Full octoprint networking via GPIO relays

FWIW: Sketchup MPCNC:

Also TinkerCAD version:

Electric Drivetrain Development:

BORGI / Axial Flux stuff:

Designed some rough coil winders for motor design here:


Also, an itty-bitty, skate bearing-scale axial flux / 3-phase motor to hack upon:


- Jess

Deploy Shiny R apps along Node.JS

Find the tools in action on Heroku as a node.js app!

See the code on GitHub:

After many iterations of ideas regarding deployment for a few research Shiny R apps, I am glad to say the current web-only setup is 100% free and simple to adapt.   I thought I'd go through some of the Node.JS bits I have been fussing with. 

The Current one:  

Heroku has a free tier for node.js apps. See the pricing and limitations here. As far as I can tell, there is little reason to read too far into a free plan; they don't have my credit card, and they seem to convert enough folks into paid customers to be nice enough to offer a free something to everyone.

Shiny apps- this works straight from RStudio. They have a free plan. Similar to Heroku, I can't care too much about limitations, as it is completely free.

The reasons to use Node.JS (even if it is just a jade/HTML wrapper) are numerous, though they may not be completely obvious.  If nothing else, Heroku will serve it for free….

Using node is nice because you get all the web-layout-ux-ui stacks of stuff if you need them.  Clearly, I have not gone to many lengths to do that, but it is there.

Another big one is using node.js with Electron. The idea is a desktop app framework serves up your node app to itself via Chromium. I had a bit of a foray with Electron- the execa npm package (`npm install execa`) let me launch a Shiny server from Electron, wait a moment, then load a node/browser app that acts as an interface to the Shiny process. While this mostly worked, it is definitely overkill for my Shiny stuff. Good to have as a tool though.


Recycled Personal “Cloud Computing” under NAT

As many may intuit, I like the AWS ecosystem; it is easy to navigate and usually just works.  

...However- more than 1000 dollars later, I no longer use AWS for most things....


My goals: 

Selective sync:  I need an unsync function for projects and files due to the tiny 256GB SSD on my laptop (odrive is great, just not perfect for cloud computing).

Shared file system:  access files from Windows and OSX, locally and remote

Server must be headless, rebootable, and work remotely from under a heavy enterprise NAT (College)

Needs more than 8gb ram

Runs windows desktop remotely for gis applications, (OSX on my laptop)


Have as much shared file space as possible: 12TB+


Server:  recycled, remote, works-under-enterprise-NAT:

Recycled Dell 3010 with i5:

- Cost: $75 (+ ~$200 in windows 10 pro, inevitable license expense) 

- Free spare 16GB RAM lying around; local SSD and 2TB HDD upgrades

- Does Microsoft-specific GIS bidding, can leave running without hampering productivity

Resilio (bittorrent) Selective sync:

- Cost: $60

- p2p Data management for remote storage + desktop

- Manages school NAT and port restrictions well (remote access via relay server)

Drobo 5c:

Attached and syncs to 10TB additional drobo raid storage, repurposed for NTFS

  • Instead of EBS (or S3)


What I see:  front end-

Jump VNC Fluid service:

- Cost: ~$30

- Super efficient Fluid protocol; clients include Chrome OS and iOS (with mouse support!)

- Manages heavy NAT and port restrictions well

- GUI for everything, no tunneling around a CLI

  • Instead of Workspaces, EC2

Jetbrains development suite: (OSX)

- Cost:  FREE as a verified GitHub student user.

- PyCharm IDE, Webstorm IDE

  • Instead of Cloud 9


Total (extra) spent: ~$165

(Example:  my AWS bill for only October was $262)



New App:  KML Search and Convert

Written in R; using GDAL/EXPAT libraries on Ubuntu and hosted with AWS EC2.


Here is a simple (beta) app of mine that converts KML files into Excel-friendly CSV documents.  It also has a search function, so you can download a subset of data that contains keywords.   🙂

The files will soon be available in Github.

I'm still working on a progress indicator; it currently lets you download before it is done processing.   Know that a completely processed file is titled "kml2csv_<yourfile>.csv".

...YMMV.  xD
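The app itself is written in R against GDAL/EXPAT, but the core KML-to-CSV-with-search idea can be sketched with Python's standard library alone (the function and column names here are my own illustration, not the app's):

```python
import csv
import xml.etree.ElementTree as ET

# KML is XML in the OGC KML namespace
NS = {"kml": "http://www.opengis.net/kml/2.2"}

def kml2csv(src, dst, keyword=None):
    """Extract Placemark name/description/coordinates to CSV;
    optionally keep only rows whose description contains keyword."""
    root = ET.parse(src).getroot()
    with open(dst, "w", newline="") as out:
        w = csv.writer(out)
        w.writerow(["name", "description", "coordinates"])
        for pm in root.iter("{http://www.opengis.net/kml/2.2}Placemark"):
            name = pm.findtext("kml:name", default="", namespaces=NS)
            desc = pm.findtext("kml:description", default="", namespaces=NS)
            coords = pm.findtext(".//kml:coordinates", default="",
                                 namespaces=NS).strip()
            if keyword is None or keyword.lower() in desc.lower():
                w.writerow([name, desc, coords])
```

Calling `kml2csv("birds.kml", "out.csv", keyword="pair")` would write only the placemarks whose description mentions "pair".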

GDAL for R Server on Ubuntu – KML Spatial Libraries and More


If you made the (possible) mistake of running with a barebones Red Hat Linux instance, you will find it is missing many things you may want in R.   I rely on GDAL (the definitive Geospatial Data Abstraction Library) in my local macOS R setup, and want it on my server too.  GDAL contains many libraries you need to work with KML, rgdal, and other spatial packages.  It is massive and usually takes a long time to sort out on any machine.

These notes assume you are already involved with an R server (usually port 8787 in a browser).  I am running mine from an EC2 instance on AWS.

! Note this is a fresh server install using Ubuntu; I messed up my original ones while trying to configure GDAL against conflicting packages. If you are creating a new one, opt for at least a t2.medium (or go bigger) and find the latest Ubuntu server AMI.  For these instructions, you want an OS that is as generic as possible.

On Github:

From Bash:

# SSH into the EC2 instance: (here is the syntax just in case)

#ssh -i "/Users/YourSSHKey.pem"

sudo su -

apt-get update

apt-get upgrade

nano /etc/apt/sources.list

#enter as a new line at the bottom of the doc:

deb xenial/

#exit nano


chmod 777



From SSH:

# SSH into the EC2 instance: (here is the syntax just in case)

ssh -i "/Users/YourSSHKey.pem"

# if you can, become root and make some global users- these will be your access to

# RStudio Server and shiny too!

sudo su -

adduser <Jess>

# Follow the following prompts carefully to create the user

apt-get update

nano /etc/apt/sources.list

# enter as a new line at the bottom of the doc:

deb xenial/

# exit nano

# Start, or try bash:

apt-get install r-base

apt-get install r-base-dev

apt-get update

apt-get upgrade


tar xvf gdal-2.3.1.tar.gz

cd  gdal-2.3.1

# begin making GDAL: this all takes a while

./configure  [if you need proper KML support (like me), search on configuring with expat or libkml. There are many more options for configuration based on other packages that can go here, and this is the step to get them in order...]

sudo make

sudo make install

cd # Try entering R now and check the version!

# Start installing RStudio server and Shiny

apt-get update

apt-get upgrade
sudo apt-get install gdebi-core
sudo gdebi rstudio-server-1.1.456-amd64.deb

# Enter R or go to the graphical R Studio installation in your browser


# Authenticate if using the graphical interface using the usr:pwd you defined earlier

# this will take a long time


# Note any errors carefully!



install.packages(c("data.table", "tidyverse", "shiny"))  # etc

Well, there you have it!



## Later, ONLY IF you NEED Anaconda, FYI:

# Get Anaconda: this is a large package manager, and could be used for patching up missing dependencies:

# Use "ls" followed by "rm -r <anaconda>" (fill in with the ls results) to remove conflicting conda

# installers if you have any issue there, I am starting fresh:

mkdir binconda

# *making a weak attempt at sandboxing the massive new package manager installation*

cd binconda
# install and follow the prompts

# Close the terminal window completely and start a new one, and ssh back to where you left

# off.  Conda install requires this.

# Open and SSH back into your instance.  You should now have either additional flexibility

# in patching holes in dependencies, or some large new holes in your server.  YMMV.

### Done

Red Hat stuff:

Follow these AWS instructions if you are doing something else:

See my notes on this here:

and notes on Shiny server:

GDAL on Red Hat - existing threads on this:

This is a nice short thread about building from source:

neat RPM package finding tool, just in case:

Info on the LIBKML driver if you end up with issues there:


I hope this is useful- GDAL is important, and it is best to set it up early.  It will be a pain, but so is losing work while trying to patch it in later.  xD




INFO: Deploy a Shiny web app in R using AWS (EC2 Red Hat)


As a follow-up to my post on how to create an AWS RStudio server, the next logical step is to host some useful apps you created in R for people to use.  A common way to do this is the R-specific tool Shiny, which is built in to RStudio.  Learning the syntax to convert R code into a Shiny app is rather subtle, and can be hard.  I plan to do a more thorough demo on this- particularly the use of the $ symbol, as in “input$output”- later. 🙂


It turns out hosting a Shiny Web app provides a large number of opportunities for things to go wrong….  I will share what worked for me.  All of this info is accessed via SSH, to the server running Shiny and RStudio.


I am using the AWS “Linux 2” AMI, which is based on the Red Hat OS.  For reference, here is some extremely important Red Hat CLI language worth being familiar with when debugging:


“sudo yum install” and “wget” are for fetching and installing things like Shiny.  Don’t bother with instructions that include “apt-get install”, as they are for a different Linux OS!


“sudo chmod -R 777” is how you change your directory permissions for read, write, and execute (all of those enabled).  This is handy if your server is disconnecting when the app tries to run something- it is a simple fix to a problem not always evident in the logs.  The default root folder from which Shiny apps are hosted and run is “/srv/shiny-server” (or just “/srv” to be safe).


“/var/log/shiny-server.log” is the location of the current Shiny logs (open it with nano).


“sudo stop shiny-server” followed by “sudo start shiny-server” is the best way to restart the server- “sudo restart shiny-server” is not a sure bet on any other process.  It is true, other tools like a node.js server or nginx could impact the success of Shiny- if you think nginx is a problem, “cd /etc/nginx” followed by “ls” will get you in the right direction.  Others have cited problems with Red Hat not including the directories and files at “/etc/nginx/sites-available”.  You do not need these directories (though they are probably important for other things).


sudo rm -r” is a good way to destroy things, like a mangled R studio installation.  Remember, it is easy enough to start again fresh!  🙂


“sudo nano /etc/shiny-server/shiny-server.conf” is how to access the config file for Shiny.  The fresh install version I used did not work!  There will be lots of excess in that file, much of which can cause issues in a bare-bones setup like mine.  One important key is to ensure Shiny is using a root user- see my example file below.  I am the root user here (jess)- change that to mirror, at least for the beginning, the user defined as root in your AWS installation.  See my notes HERE on that- it is defined in the advanced settings of the EC2 instance.


BEGIN CONFIG FILE:   (or click to download) *Download is properly indented

# Define user: this should be the same user as the AWS root user!
run_as jess;
# Define port and where the home (/) directory is
# Define site_dir/log_dir - these are the defaults
listen 3838;
location / {
site_dir /srv/shiny-server;
log_dir /var/log/shiny-server;
directory_index on;
}


Well, the proof is in the pudding.   At least for now, you can access a basic app I made that cleans CSV field-data files that were entered into Excel by hand.  They start full of missing fields and have a weird two-column setup for distance- the app cleans all these issues and returns a 4-column (from 5-column) CSV.

Download the test file here:   2012_dirt_PCD-git

And access the app here:  Basic Shiny app on AWS!

Below is an iFrame into the app, just to show how very basic it is.  Give it a go!


Off-Grid File Sharing with SAMBA / GL.iNet

Note:  SMB / SharePoint is surely better with a proper server/computer.  A Raspberry Pi running OpenMediaVault (Debian) is a more common and robust option (still 5v low power).

If you are actually in an "it must be done in OpenWRT" scenario, click here for my Samba config file: OpenWRT_Samba-config and see below.  Also, please use an NTFS or EXT4 format.  🙂


...While my sharing method wasn't actually adopted by others, I still think it is good to know!


How to Query KML point data as CSV using QGIS and R


Here you can see more than 800 points, each describing an observation of an individual bird.  This data is in the form of KML, a sort of XML document from Google for spatial data.


I want to know which points have “pair” or “female” in the description text nodes using R.  This way, I can quickly make and update a .csv in Excel of only the paired birds (based on color bands).



Even if there were a description string search function in Google Earth Pro (or other organization-centric GIS/waypoint software), this method is more robust, as I can work immediately with the output as a data frame in R, rather than a list of results.


First, open an instance of QGIS.  I am running ~2.8 on OSX.  Add a vector layer of your KML.

“Command-A” in the point dialog to select all before import!

Next, under “Vector”, select “Merge vector layers” via Data Management Tools.


Select CSV and elect to save the file instead of using a temporary/scratch file (this is a common error).

Open your csv in Excel for verification! 
The R bit:

# query for paired birds

library(data.table)  # fread
library(dplyr)       # contains, combine
library(readr)       # write_csv

data <- data.frame(fread("Bird_CSV.csv"))

pair_rows <- contains("pair", vars = data$description)

fem_rows <- contains("fem", vars = data$description)

result <- combine(pair_rows, fem_rows)

result <- data[result,]

write_csv(result, "Paired_Birds.csv")
Solar upgrades!


Incredibly, the hut we are working from actually had another solar panel just laying around.  🙂
This 50w square panel had a junction box with MC4 connectors, the standard for small scale solar installations.  As I was unsure how to know when we are running low on electricity reserves, I decided to make some adjustments.

Additional 50w solar panel

(Everything is still solder, hot glue, alligator clips, and zip-ties I’m afraid…)
I traded my NEMA / USA two-prong connection for two MC4 splitters, such that both panels can run in parallel (into a standard USA 110v extension cord that goes into our hut).  This way we should generate well over one 35Ah battery's worth of electricity a day.

Dual MC4 splitters to extension cord

I also added a cheap 12v battery level indicator.  It is not very accurate (as it fluctuates with solar input) but it does give us some insight about how much "juice" we have available.  (I also wired and glued the remote-on switch to the back of the input for stability.)

Added battery indicator and button


840 Watts of Solar Power!

Equipment used:

Inverter/PWM Controller:

2x 35ah Batteries:

100w solar panel:

We need power!  While doing bird research in the wilds of northern NH, it became evident we needed electricity to power computers, big cameras, and phones/GPS units.

Below is a table of the system and our expected electricity needs:

System                  Solar 100w    35Ah universal (x2)
Ah per day              33.3          35         TOTAL Ah Reserve: 70
V                       12            12         Parallel wiring: 12v
Wh in                   400           420        TOTAL Wh Reserve: 840
W                       100
Cost                    $105.00       $64.00
Ah/$                                  2
Sun Hour / Multiplier   4             2

Need/Day     Wh      multiplier    consump. in Wh (total = 259.36)
Computer     100     2.5           250
iPhone       1.7     2             3.4
AAs          11.2    0.3           3.36
Camera       2.6     1             2.6

*The milk crate system below can charge a MacBook Pro (~100Wh battery) around 8-9 times from completely empty.  

**Remember:  V*A=W,  W/V=A, and Watts over time is Wh.  


+/- relates to size of standard prongs

Parallel maintains 12v but doubles Ah. (Series would go to 24v at 35ah)
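The back-of-envelope math above, as a quick sketch (the ~100Wh MacBook battery figure is my assumption, per the milk-crate note):

```python
V = 12            # parallel wiring keeps the bank at 12 V
Ah = 35 * 2       # two 35 Ah batteries in parallel double the Ah
Wh_reserve = V * Ah           # W = V * A, so reserve in Wh = V * Ah

macbook_wh = 100              # assumed ~100 Wh MacBook Pro battery
charges = Wh_reserve / macbook_wh   # full charges from empty
```

This gives an 840 Wh reserve and about 8.4 full laptop charges, matching the table.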


Intro to the AWS Cloud 9 IDE

The Cloud 9 IDE is the fastest way I have come up with to develop web-based or otherwise "connected" programs.    Because it lives on a Linux-based EC2 server on AWS, running different node, HTML, etc. programs that rely on a network system just works- it is all already on a network anyway.   🙂  There is no downtime trying to figure out your WAMP, MAMP, Apache, or localhost situation.

Similarly, other network programs work just as well-  I am running a MySQL server over here (RDS), storage over there (S3), and have various bits in Github and locally.   Instead of configuring local editors, permissions, and computer ports and whatnot, you are modifying the VPC security policies and IAM groups- though generally, it just works.

Getting going:   The only prerequisite is you have an AWS account.  Students:  get $40 EC2 dollars below:
Open the cloud 9 tab under services.



Setup is very fast- just know that if others are going to be editing too, understand the IAM policies and what VPC settings you actually want.


Know this is ideally a browser-based service; I have tried to come up with a reason an SSH connection would be better and didn't get anywhere.

For one person, micro is fine.   Know these virtual "RAMs" and "CPUs" are generous....





The default network settings are set up for you.   This follows good practice for one person; more than that (or if you are perhaps a far-travelling person) note these settings.  They are always editable under the VPC and EC2 instance tabs.



That's it!   Other useful things to know:

This is a Linux machine maintained by Amazon.   Packages you think should work and be up to date (arguably like any other Linux machine, I guess...) may not be.  Check your basics like the NPM installer and versions of what you're going to be working on; they may very well be different from what you are used to.

In the editor:

You have two panels of workspace in the middle- shown is node and HTML.   Everything is managed by tabs- all windows can have as much stuff as you want this way.

Below there is a "runner" (shown with all the default options!) and a terminal window.  Off to the left is a generic file manager.



I hope this is useful, it sure is great for me.


Using ESRI ArcGIS / ArcMap in the AWS Cloud

Selling AWS to... myself   🙂

Why struggle with underpowered local machines and VMs or watered-down web platforms for heavy lifting,  learning and work?

In addition to using ESRI software on Mac computers, I am a big fan of the AWS WorkSpaces service (in addition to all their other developer tools, some of which are map-relevant: RDS for SQL and EC2 Red Hat servers for data management, for example).

Basically, for between ~$20 and ~$60 a month (max, and not factoring in EDU discounts!), a user gets a well-oiled remote desktop.   You can download and license desktop apps like ArcMap and GIS products, file managers, and more from any computer connected to the internet.  This service does not require much savvy; you make/receive a password and log right in.

A big plus here, of course, is the Workspaces Application Manager (WAM); small sets of licenses can be administered the same way desktops would be, with extra ease since they are already part of the same cloud anyway.

Another plus is any client- netbook, MacBook, VM, etc.- will work equally well.  In this regard it can be a very cheap way to get big-data work done on otherwise insufficient machines.  Local storage and file systems work well with the client application, with the caveat being network speed.

How to make a AWS R server

When you need an R server and have lots of data to process, AWS is a great way to go.   Sign up for the free tier and poke around!

Creating an AWS RStudio server- using both the R snippet (works, but the R core bits are NOT present and it will not run yet) and the JSON snippet provided- the suite being installed:

Follow most of the AWS blog AMI info, with the following items:

AMI:  Amazon Linux 2 (more packages and extras v. standard)  

  • t2.micro (free tier)
  • IAM policy follows AWS blog JSON snippet
  • Security Policy contains open inbound ports 22, 8787, 3838 (the latter two for R server specific communication)
  • Append user, username:password to the blog post’s initial RStudio install text (pasted into the “advanced” text box when completing the AMI setup)


SSH into the EC2 instance

sudo yum install -y

sudo yum-config-manager --enable epel

sudo yum repolist


sudo yum update -y

sudo yum install -y R

sudo rstudio-server verify-installation


Access the graphical R server:

In a web browser, tack “:8787” onto the end of the instance’s public “connect” link.  If it doesn’t load a login window (but seems to be trying to connect to something), the security policy is probably being overzealous…


Notes on S3-hosted data:

  • S3 data is easiest to use if it is set to be public.
  • There are s3-specific tools for R, accessible as packages from CRAN directly from the R interface
  • Note data (delimited text at least) hosted in S3 will behave differently than it does locally, e.g. spaces, “na”, “null” need to be “cleaned” in R before use.  


There we have it!



Boutique everything: When The Hobby Grows Up


Food.  Clothes.  Art.  Musical Equipment.  Consumer Design and Products.  Can a mere citizen enter the fray of cutting edge design and production?

As a hobbyist designer with a passion for, say, high-end audio, the options for actually producing a quality, well-executed product may seem both alluring and completely not worthwhile.  “It’s just a hobby,” some say, or, “The cost of manufacturing tools or a bid at the factory floor in China is way bigger than my love for sound,” or, “Nobody would ever purchase my design; there are so many other companies who have done this longer than me.”  These answers are all valid, but may not be the complete picture when it comes to local, boutique production.    

Can a passionate enthusiast use makerspace technology and peer support to bring small batches/limited runs of high quality products to a localized, niche market?

Could a food connoisseur use networking services to construct a timely supply chain for seasonal meals at local restaurants or cafes?

Would a local tailor be able to source materials and equipment to realize the material science and design they have always dreamed of for a coat, in small batches?

Using cutting-edge makerspaces and the subsequent networking opportunities, I believe producing small batches of high quality goods and utilizing a local business/niche marketing approach or distribution system could increase the innovation and quality of any given local economy.

The idea of “group buys” is well established in DIY audio circles.  Folks going in on a board design for fabrication will often drum up some enthusiasm on the internet or elsewhere, in a move to offset the high entry price of board manufacture.  I have noticed some folks take it a step further, and will not only complete the project they intended to, but perfect the project into a product and do a run of a few pieces to a few dozen and beyond.  This model is actually a great asset to the developing maker; offsetting the cost (or even making a few coins in profit!) of larger projects inherently makes bigger and better projects feasible.

The folks building audio equipment in their basement, garage, or bedroom are, in essence, artists exploring art through avenues otherwise devoid of artisan qualities.  It is easy to reproduce sound commercially- Apple supplies those iBud-earPod-headBeats with every phone they sell.  Yet, the people in DIY audio are taking on audio components exactly how a great potter would craft a new bowl or coffee cup; functional sculpture, art in one of its oldest forms.

Screenshot of Jazzman's blog

Below is a picture of one of the quasi-famous Jazzman ESL panels.  A true labor of love and work of art, Charlie has pioneered the processes required to build an ultra-top-end electrostatic loudspeaker within the confines of the home, job, and hobby budget.  Jazzman's speakers are built almost exclusively by hand, using careful measurement techniques to hold tight tolerances instead of machines that could do this automatically- making these panels truly one of a kind, and certainly not an option for even the most ambitious cottage-industry entrepreneurs.

I bring these panels up simply to show what home-brew audio (or any labor-of-love hobby) is about: craftsmanship, dedication, and a desire to learn.  Falling right in with home-brew beer, local pottery, cooking, painting, tailoring, and more, one can see from this artisanal point of view the value in these kinds of work.

Unlike some of these art forms found exclusively in art shows and galleries, only recently has there been an opportunity for individuals to reverse the commercialization of otherwise beautiful hobbies.

Commercialization and hobbies: can we have both?

You bet.  As individuals get better at their craft and further down the hobbyist rabbit hole, I (personally) wonder where to draw the line as a hobby.  Don't!  We develop makerspaces to propel creation into hyperdrive; the next and last step in completing the artist's high-end project circle is selling the last project so the next batch can be justified.  Because rapid fabrication and makerspaces are "a thing" now, people need to understand what comes next once all those creative and production juices are flowing.  I think many makers may not approach their custom brazed bikes, amazing wooden trinkets, or tube guitar amps the way a painter would monetize paintings- but they (we) should.  Art stores, art shows, audio meetups, DIY ecommerce sites, Etsy, craft conventions...  These are real venues we should be adding to our vocabulary as makers.  It is the last step to a full-circle justification, and for me (in my hobby bird photography work, for sure) it simply feels amazing to be at that stage of chatting it up with locals about where I took the picture of the merganser.  It takes way more effort than I or my fellow artists will let on (learning high-end home printing, commerce, finding a materials supplier, building a website, etc.), and marketing/selling is not NEARLY as glamorous as hacking away at our craft.

But, at the end of the day, this is the right thing to do.  Showing others through commerce the true value of maker craft not only educates and enriches, but increases the value in our local economies and local-maker-wizardry.

DIY MrSpeakers “Open Alpha” 3d Printed Headphones: Dan Did It Again!


Headfi forum with release and build notes

MrSpeakers headphone plug terminals 

Acoustic wool I used

Brainwavz earpads

Prices for Prusa i3 on Ebay

Other materials include a recycled Grado cable, spare bolts from my Prusa i3 printer (and obviously my i3 printer for printing, too) and Hatchbox black PLA.

I like the Hatchbox stuff; thus far it has proven to be affordable and reliable, even at low printing temperature (180 degrees is all I can get).

One side, stock T50RP. Other side, Open Alpha.


So: before I get into the build details, the bottom line is that this is the definitive overhaul to complete in terms of T50RP mods.  I started with a refurbished MkIII, and hated the crazy EQ spike at about where hi-hats generally reside.  This was not a subtle issue, and would give me a headache quickly.  From modding with the stock cups, I found denser damping = a worse treble issue and less bass.  Think of it like a bass trap- I used essentially felted wool, which in its raw form killed all frequencies except for the strongest... ...which in this case was about 2.8 kHz, to give a rough estimate.  Ouch!  After googling around, I discovered Dan was using cotton balls as the primary damping material, with a thin layer of dense acoustic mat of some kind to line the cup (which helps, I think, mostly with leakage control and resonance from the plastic and flat surfaces of the cup itself).

Damping the Alphas:

I already had this fancy acoustic wool, so I figured I could make some "wool balls" by separating the dense wad into fluffy pillows.  I'd say I erred on the side of less dense- at this point I was printing my Alpha cups, and the space in there is huge, leaving ample room to layer up some of these wool balls.  I did not feel the need to line the cups with a damping mat of any kind, because my wool already seemed to kill noise and reflection like nobody's business.

Printing the cups and other parts:

...Painless, except on the inside of the cups there is a dip where the headband arm is mounted.   For no particular reason, I printed both cups without supports- but not without a large amount of "PLA spaghetti"  and the occasional emergency "duct tape the PLA spaghetti wad to the bed so it can have something to build on..."


Firstly, these sound nothing like the stock T50RP.  AT ALL.  The low end goes quite low without a huge amount of distortion, the mids are wide and spacious, and the treble (including the significantly tamed-down spike) is sprightly and provides nice "pop" and sparkle.  These headphones are a pleasure to listen to- I've been doing Art Blakey and Coltrane lately because the reproduction of the jazz bass is superb.  Couple that with the expansive space where the sax and piano reside, and these make a nice way to relax at the end of a day (which is how I have been relaxing each evening since I made them).  Obviously, we still have some fundamental setbacks.  The Fostex driver is unbelievably inefficient.  It takes much care to juice these properly (I actually like them through my E12 portable amp, because I can crank the input volume with very little distortion for quite a while).  Additionally, there is a limit to how much detail we can siphon out of the driver; this uber-mod definitely maxes out the clarity and definition this driver can provide.  For example, the successor to this headphone, when it was made commercially by Dan/MrSpeakers, is the AEON (still in preorder mode at the time of writing)- a completely in-house design trickling technology down from the company's acclaimed ETHER headphones.  I am actually lucky enough to have had a few hours to play with them and chat with the inventor (Dan)- quite simply, the clarity and silkiness of the AEON demolishes the notion of clarity with my Open Alphas.  THAT SAID, an $800 carbon fiber headphone invented from the ground up by Dan (who earns the highest regard even from competitors- ZMF, even HiFiMan reps- as the most dedicated headphone creator) is obviously not really competing against a headphone I made for <$300, INCLUDING the 3d printer and donor headphones.... 🙂


The Prusa i3 Update: My ~$150 3d Printer


the prusa i3 on ebay

Cura 3d slicing and printing gcode software

Marlin firmware for "real" arduinos on Github

I purchased my 3d printer new from the USA for exactly $155.   That is $50 less than a single DUM headphone cable from MrSpeakers.  It did not work until much duct tape had been applied and zip ties zipped and jigs rigged, but overall it wasn't that bad.  Now that I'm up and running, there are a few things to note:

  1. It can barely heat PLA (at 180 C), so forget about ABS.  I have a hunch this may have something to do with the wimpy PSU it came with.  I happen to have a full-size version of the same PSU, rated at 30 amps @ 12 volts- someday I will try swapping that one in and see if I can get more temperature.  The firmware has ABS settings; it just will never reach the 225 degrees called for.
  2. The firmware is very firmly stuck in the arduino.  I spent hours upon hours trying to flash this arduino with a custom Marlin build, as I learned to do on my MPCNC project, to no avail.  The best thing to do is make all the tweaks in Cura, load the resulting gcode onto an SD card, and run the printer off that.  The USB was getting wonky on me (I can't remember exactly what it was doing, it just wasn't the right thing).  Plus, with the SD card you can put the printer wherever you want.
  3. The build quality... there is no quality.  Be prepared to make up the assembly and troubleshooting as you go.
  4. The "heated" bed- the bed game is rather weak.  Things sometimes stick, but usually don't.  Long story short (and many prints that skittered off the bed before they were done), I learned the best way to get prints to stick is with washable glue sticks.

Once I procure some glue sticks, I'll be printing out my "real" machine.  MPCNC, Here we come!

(beta) – How to build “The Phone Charger”


Imagine: any AA or 9v battery could charge your phone and other USB gadgets.   Behold, the LM7805!

This is a voltage regulator chip.  Stick between about 7 and, say, 12 volts in one end and huzzah! (about) 5 volts pops out the other end.  (Feed it less- a linear regulator like this drops roughly 2 volts- and you'll get a bit under 5 volts out.)  These cost a few dimes and can be had on eBay, 10 for $4.
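One thing worth eyeballing before you build: a linear regulator burns the voltage difference off as heat.  A quick back-of-the-envelope in shell- the 0.5 A charge current is an assumption, since a real phone will draw whatever it negotiates:

```shell
# Heat dissipated by a linear regulator: P = (Vin - Vout) * I
# 9 V battery, 5 V out, assumed 0.5 A draw:
awk 'BEGIN { vin = 9; vout = 5; i = 0.5; printf "%.1f W\n", (vin - vout) * i }'
# 2.0 W
```

Two watts will make a bare TO-220 package quite warm, so give it some airflow (or a scrap-metal heatsink) inside whatever casing you pick.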

Below is the BOM:  (to make 10 chargers!)

Sourced via ebay:
$4:  10x of LM7805 5v regulators
$4:  10x of little toggle switches
$4:  10x of 9v snap connectors (I used a 6v supply from AAs, but as far as the chip is concerned it doesn't matter too much- I am reading ~4 volts and enough power to charge a phone out of mine now)
$0: PCB-  Technically they aren't even needed, but for our uses they make the soldering and building more straightforward.
$1:  10x of USB A ports from China
$?: Casing- be creative.  I want to make one in a shrink-wrapped tube or 3d printed box or something.
First, lay the parts out like so:
...Notice how, when facing the shiny side of the chip, the left leg lines up with the left-hand side of the USB port (looking into the port):

Then arrange them like so:

...Notice I squished the power wires under the bent pins.  RED GOES TO THE PIN ON THE RIGHT when facing the shiny side, and BLACK GOES TO THE MIDDLE ONE.

...The remaining leg is attached to the far left pin on the USB port (when facing the chip's shiny side remember).
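For reference, here is the standard 7805 pinout- note that the left-to-right order flips depending on which side you face, which is why the steps above keep saying "shiny side" (the metal tab):

```
LM7805 (TO-220 package), legs pointing down:

  Facing the labeled front:       1 = INPUT    2 = GROUND   3 = OUTPUT
  Facing the metal tab
  ("shiny side"):                 3 = OUTPUT   2 = GROUND   1 = INPUT
```

That matches the wiring above: facing the shiny side, red (battery +) goes to the rightmost pin (input), black to the middle (ground), and the remaining leg (output) feeds the USB port.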

Then it works!!  YAY!

Here I verify it works by charging my commercial USB charger with my DIY duct-tape one:

Good luck!