Trans: Latin prefix implying "across" or "beyond", often used in gender-nonconforming situations – Scend: archaic word describing a strong "surge" or "wave", originating with 15th-century English sailors – Survival: 15th-century English compound word describing an existence only worth transcending.


New App:  KML Search and Convert

Written in R, using the GDAL/Expat libraries on Ubuntu, and hosted on AWS EC2.


Here is a simple (beta) app of mine that converts KML files into Excel-friendly CSV documents.  It also has a search function, so you can download a subset of data that contains keywords.   🙂
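The app's own source is not shown in this post, but here is a minimal sketch of the core idea in R, assuming the sf package (which wraps GDAL) and a KML file whose placemarks have a Description field; the actual app may differ:

library(sf)

# hypothetical sketch: convert a KML file to CSV, optionally keeping only
# rows whose Description contains a keyword
kml_to_csv <- function(kml_path, keyword = NULL) {
  pts <- st_read(kml_path, quiet = TRUE)                     # read KML via GDAL
  df <- cbind(st_drop_geometry(pts), st_coordinates(pts))    # attributes plus X/Y columns
  if (!is.null(keyword)) {
    df <- df[grepl(keyword, df$Description, ignore.case = TRUE), ]
  }
  out <- paste0("kml2csv_", tools::file_path_sans_ext(basename(kml_path)), ".csv")
  write.csv(df, out, row.names = FALSE)
  out
}

# kml_to_csv("MyPoints.kml", keyword = "pair")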

The files will soon be available on GitHub.

I’m still working on a progress indicator; it currently lets you download before it is done processing.   Note that a fully processed file is titled “kml2csv_<yourfile>.csv”.

…YMMV.  xD

GDAL for R Server on Ubuntu – KML Spatial Libraries and More


If you made the (possible) mistake of starting with a bare-bones Red Hat Linux instance, you will find it is missing many things you may want in R.   I rely on GDAL (the definitive Geospatial Data Abstraction Library) in my local OSX R setup, and want it on my server too.  GDAL provides many of the libraries you need to work with KML, rgdal, and other spatial packages.  It is massive and usually takes a long time to sort out on any machine.

These notes assume you already have an R server running (RStudio Server, usually reached on port 8787 in a browser).  I am running mine from an EC2 instance with AWS.

! Note this is a fresh server install, using Ubuntu; I messed up my original ones while trying to configure GDAL against conflicting packages. If you are creating a new one, opt for at least a t2.medium (or go bigger) and find the latest Ubuntu server AMI.  For these instructions, you want an OS that is as generic as possible.

On Github:

https://github.com/Jesssullivan/rhel-bits

From Bash:

# SSH into the EC2 instance: (here is the syntax just in case)

# ssh -i "/Users/YourSSHKey.pem" ec2-user@yourAWSinstance.amazonaws.com

sudo su -

apt-get update

apt-get upgrade

nano /etc/apt/sources.list

#enter as a new line at the bottom of the doc:

deb https://cloud.r-project.org/bin/linux/ubuntu xenial/

#exit nano

wget https://raw.githubusercontent.com/Jesssullivan/rhel-bits/master/xen-conf.sh

chmod 777 xen-conf.sh

./xen-conf.sh

Or…

From SSH:

# SSH into the EC2 instance: (here is the syntax just in case)

ssh -i "/Users/YourSSHKey.pem" ec2-user@yourAWSinstance.amazonaws.com

# If you can, become root and make some global users - these will be your logins for RStudio Server and Shiny too!

sudo su -

adduser <Jess>

# Follow the prompts carefully to create the user

apt-get update

nano /etc/apt/sources.list

# enter as a new line at the bottom of the doc:

deb https://cloud.r-project.org/bin/linux/ubuntu xenial/

# exit nano

# Start here, or just run the xen-conf.sh script from the section above:

apt-get install r-base

apt-get install r-base-dev

apt-get update

apt-get upgrade

wget http://download.osgeo.org/gdal/2.3.1/gdal-2.3.1.tar.gz

tar xvf gdal-2.3.1.tar.gz

cd  gdal-2.3.1

# begin making GDAL: this all takes a while

./configure  [if you need proper KML support (like me), look into configuring with expat or libkml.   There are many more configuration options based on other packages that can go here, and this is the step to get them in order…]

sudo make

sudo make install

cd # Try entering R now and check the version!

# Start installing RStudio server and Shiny

apt-get update

apt-get upgrade
sudo apt-get install gdebi-core
wget https://download2.rstudio.org/rstudio-server-1.1.456-amd64.deb
sudo gdebi rstudio-server-1.1.456-amd64.deb

# Enter R or go to the graphical R Studio installation in your browser

R

# Authenticate if using the graphical interface using the usr:pwd you defined earlier

# this will take a long time

install.packages("rgdal")

# Note any errors carefully!

Then:

install.packages("dplyr")

install.packages(c("data.table", "tidyverse", "shiny"))  # etc
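Not from the original post, but a quick sanity check inside R can confirm that rgdal picked up the GDAL build you just compiled, and that KML support made it in (LIBKML only shows up if GDAL was configured with it):

library(rgdal)
getGDALVersionInfo()            # should report the GDAL 2.3.1 built above
drv <- ogrDrivers()$name        # vector drivers compiled into GDAL
c("KML", "LIBKML") %in% drv     # TRUE means that driver is available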

Well, there you have it!

-Jess

Extras:

##Later, ONLY IF you NEED Anaconda, FYI:

# Get Anaconda: this is a large package manager, and could be used for patching up missing dependencies:

# Use "ls" followed by "rm -r <anaconda>" (fill in from the ls results) to remove conflicting conda installers if you have any issues there; I am starting fresh:

mkdir binconda

# *making a weak attempt at sandboxing the massive new package manager installation*

cd binconda
wget http://repo.continuum.io/archive/Anaconda2-4.3.0-Linux-x86_64.sh
# install and follow the prompts
bash Anaconda2-4.3.0-Linux-x86_64.sh  # match the file downloaded above

# Close the terminal window completely, start a new one, and SSH back to where you left off; the conda install requires this.

# Once back in your instance, you should have either some additional flexibility for patching holes in dependencies, or some large new holes in your server.  YMMV.

### Done

Red Hat stuff:

Follow these AWS instructions if you are doing something else:

https://aws.amazon.com/blogs/big-data/running-r-on-aws/

See my notes on this here:

https://www.transscendsurvival.org/2018/03/08/how-to-make-a-aws-r-server/

and notes on Shiny server:

https://www.transscendsurvival.org/2018/07/16/deploy-a-shiny-web-app-in-r-using-aws-ec2-red-hat/

GDAL on Red Hat – existing threads on this:

https://gis.stackexchange.com/questions/120101/building-gdal-with-libkml-support/120103#120103

This is a nice short thread about building from source:

https://gis.stackexchange.com/questions/263495/how-to-install-gdal-on-centos-7-4

A neat RPM package-finding tool, just in case:

https://rpmfind.net/linux/rpm2html/

Info on the LIBKML driver if you end up with issues there:

http://www.gdal.org/drv_libkml.html

 

I hope this is useful- GDAL is important, and it is best to set it up early.  It will be a pain, but so is losing work while trying to patch it in later.  xD

 

-Jess

 

INFO: Deploy a Shiny web app in R using AWS (EC2 Red Hat)


As a follow-up to my post on how to create an AWS RStudio server, the next logical step is to host some useful apps you created in R for people to use.  A common way to do this is the R-specific tool Shiny, which is built into RStudio.  Learning the syntax to convert R code into a Shiny app is rather subtle and can be hard.  I plan to do a more thorough demo on this- particularly the use of the $ symbol, as in “input$output”- later. 🙂
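That demo will come later; in the meantime, here is a rough, generic illustration of the input$/output$ pattern (not the app from this post): widget values come in through input$<id>, and rendered results go back out through output$<id>.

library(shiny)

# ui: one input widget (id "n") and one output slot (id "hist")
ui <- fluidPage(
  numericInput("n", "How many random points?", value = 50, min = 1),
  plotOutput("hist")
)

# server: read widgets via input$<id>, fill outputs via output$<id>
server <- function(input, output) {
  output$hist <- renderPlot({
    hist(rnorm(input$n), main = paste(input$n, "draws"))
  })
}

shinyApp(ui, server)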

 

It turns out hosting a Shiny web app provides a large number of opportunities for things to go wrong….  I will share what worked for me.  All of this is done via SSH into the server running Shiny and RStudio.

 

I am using the AWS “Linux 2” AMI, which is based on the Red Hat OS.  For reference, here is some extremely important Red Hat CLI language worth being familiar with when debugging:

 

"sudo yum install" and "wget" are for fetching and installing things like Shiny.  Don’t bother with instructions that include "apt-get install", as they are for a different Linux OS!

 

"sudo chmod -R 777" is how you change your directory permissions for read, write, and execute (all of those enabled).  This is handy if your server is disconnecting when the app tries to run something- it is a simple fix to a problem that is not always evident in the logs.  The default root folder from which Shiny apps are hosted and run is "/srv/shiny-server" (or just "/srv" to be safe).

 

"nano /var/log/shiny-server.log" opens the current Shiny logs.

 

"sudo stop shiny-server" followed by "sudo start shiny-server" is the best way to restart the server- "sudo restart shiny-server" is not a sure bet.  It is true that other tools like a node.js server or nginx could impact the success of Shiny- if you think nginx is a problem, "cd /etc/nginx" followed by "ls" will get you in the right direction.  Others have cited problems with Red Hat not including the directories and files at "/etc/nginx/sites-available".  You do not need these directories (though they are probably important for other things).

 

"sudo rm -r" is a good way to destroy things, like a mangled RStudio installation.  Remember, it is easy enough to start again fresh!  🙂

 

"sudo nano /etc/shiny-server/shiny-server.conf" is how to access the config file for Shiny.  The fresh-install version I used did not work!  There will be lots of excess in that file, much of which can cause issues in a bare-bones setup like mine.  One important key is to ensure Shiny is using a root user- see my example file below.  I am the root user here (jess)- change that to mirror, at least for the beginning, the user defined as root in your AWS installation.  See my notes HERE on that- it is defined in the advanced settings of the EC2 instance.

 

BEGIN CONFIG FILE:


# Define user: this should be the same user as the AWS root user!
#
run_as jess;
#
# Define port and where the home (/) directory is
# Define site_dir/log_dir - these are the defaults
#
server {
  listen 3838;
  location / {
    site_dir /srv/shiny-server;
    log_dir /var/log/shiny-server;
    directory_index on;
  }
}

END CONFIG FILE

Well, the proof is in the pudding.   At least for now, you can access a basic app I made that cleans CSV field-data files that were entered into Excel by hand.  They start full of missing fields and have a weird two-column setup for distance- the app cleans up all these issues and returns a four-column (from five-column) CSV.
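The app’s code is not included here, but the kind of cleanup described above might look roughly like the following in R; the file and column names are made up for illustration only:

library(dplyr)
library(readr)

raw <- read_csv("field_data.csv")                            # hypothetical hand-entered sheet

clean <- raw %>%
  mutate(distance = coalesce(distance_a, distance_b)) %>%    # merge the two-column distance entry
  select(-distance_a, -distance_b) %>%                       # five columns down to four
  filter(!if_all(everything(), is.na))                       # drop rows that are entirely empty

write_csv(clean, "field_data_clean.csv")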

Download the test file here:   2012_dirt_PCD-git

And access the app here:  Basic Shiny app on AWS!

Below is an iFrame into the app, just to show how very basic it is.  Give it a go!

-Jess

Off-Grid File Sharing with SAMBA / GL.iNet

Note:  SMB / SharePoint is surely better with a proper server/computer.  A Raspberry Pi running OpenMediaVault (Debian) is a more common and robust option (still 5v low power).

If you are actually in an “it must be done in OpenWRT” scenario, click here for my Samba config file: OpenWRT_Samba-config and see below.  Also, please use an NTFS or ext4 format.  🙂

 

…While my sharing method wasn’t actually adopted by others, I still think it is good to know!

-Jess

How to Query KML point data as CSV using QGIS and R


Here you can see more than 800 points, each describing an observation of an individual bird.  This data is in the form of KML, a sort of XML document from Google for spatial data.

 

I want to know which points have “pair” or “female” in the description text nodes using R.  This way, I can quickly make and update a .csv in Excel of only the paired birds (based on color bands).

 

 

Even if there were a description string search function in Google Earth Pro (or other organization-centric GIS/waypoint software), this method is more robust, as I can work immediately with the output as a data frame in R, rather than a list of results.

 

First, open an instance of QGIS.  I am running ~2.8 on OSX.  Add a vector layer of your KML.

“Command-A” in the point dialog to select all before import!

Next, under “Vector”, select “Merge vector layers” via Data Management Tools.

 

Select CSV and elect to save the file instead of using a temporary/scratch file (this is a common error).

Open your csv in Excel for verification!

The R bit:

# query for paired birds

#EDIT:  Libraries
library(data.table)
library(tidyverse)

data <- data.frame(fread("Bird_CSV.csv"))

# flag rows whose description mentions "pair" or "fem" (grepl() here in place of
# the select-helper contains() and the deprecated combine())
pair_rows <- grepl("pair", data$description, ignore.case = TRUE)

fem_rows <- grepl("fem", data$description, ignore.case = TRUE)

result <- data[pair_rows | fem_rows, ]

write_csv(result, "Paired_Birds.csv")

Tada!

-Jess

Solar upgrades!


Incredibly, the hut we are working from actually had another solar panel just lying around.  🙂
This 50w square panel had a junction box with MC4 connectors, the standard for small-scale solar installations.  As I was unsure how to tell when we are running low on electricity reserves, I decided to make some adjustments.

Additional 50w solar panel

(Everything is still solder, hot glue, alligator clips, and zip-ties I’m afraid…)
I traded my NEMA / USA two-prong connection for two MC4 splitters, such that both panels can run in parallel (into a standard USA 110v extension cord that goes into our hut).  This way we should make well over one of the two 35ah batteries’ worth of electricity a day.

Dual MC4 splitters to extension cord

I also added a cheap 12v battery level indicator.  It is not very accurate (as it fluctuates with solar input) but it does give us some insight about how much “juice” we have available.  (I also wired and glued the remote-on switch to the back of the input for stability.)

Added battery indicator and button

🙂
-Jess

Gathering point data using Compass 55 on Apple iOS

Keeping track of birds is tricky!

Attached is our team’s workflow with Compass 55.    From the KML, we go into Google Earth Pro and then ArcGIS Desktop (ArcMap).   QGIS is sometimes used too.

 

Cheers,

-Jess

 

840 Watts of Solar Power!

Equipment used:

Inverter/PWM Controller:  http://a.co/fdl9YzI

2x 35ah Batteries: http://a.co/5JBIxTC

100w solar panel:  http://a.co/5JBIxTC

We need power!  While doing bird research in the wilds of northern NH, it became evident we needed electricity to power computers, big cameras, and phones/GPS units.

Below is a table of the system and our expected electricity needs:

System                   Solar 100w      35ah universal (x2)
Ah per day:              33.3            35                     TOTAL Ah Reserve: 70
V:                       12              12                     Parallel wiring: 12v
Wh in:                   400             420                    TOTAL Wh Reserve: 840
W:                       100
Cost:                    $105.00         $64.00
Ah/$:                    2
Sun Hour / Multiplier:   4               2

Need/Day                 Wh              Multiplier             Consumption in Wh
Computer                 100             2.5                    250
iPhone                   1.7             2                      3.4
AAs                      11.2            0.3                    3.36
Camera                   2.6             1                      2.6
                                         Total:                 259.36

*The milk crate system below can charge a 100 watt MacBook Pro around 8-9 times from being completely empty.  

**Remember:  V*A=W,  W/V=A, and watts multiplied by hours gives Wh.
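Not part of the original table, but a quick back-of-the-envelope check of those numbers in R:

# daily consumption from the table above, in watt-hours
daily_use <- c(computer = 100 * 2.5, iphone = 1.7 * 2, aas = 11.2 * 0.3, camera = 2.6 * 1)
sum(daily_use)            # 259.36 Wh needed per day

reserve <- 2 * 35 * 12    # two 35 Ah batteries in parallel at 12 V = 840 Wh
reserve / 100             # ~8.4 full charges of a 100 W laptop draw
reserve / sum(daily_use)  # ~3.2 days of the full daily load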

-Jess

+/- relates to size of standard prongs

Parallel maintains 12v but doubles Ah. (Series would go to 24v at 35ah)

 

Intro to the AWS Cloud 9 IDE

The Cloud 9 IDE is the fastest way I have come up with to develop web-based or otherwise “connected” programs.    Because it lives on a Linux-based EC2 server on AWS, running the various Node, HTML, etc. programs that rely on a network just works- it is all already on a network anyway.   🙂  There is no downtime trying to figure out your WAMP, MAMP, Apache, or localhost situation.

Similarly, other network programs work just as well-  I am running a MySQL server over here (RDS), storage over there (S3), and have various bits in GitHub and locally.   Instead of configuring local editors, permissions, computer ports, and whatnot, you are modifying the VPC security policies and IAM groups- though generally, it just works.

Getting going:   The only prerequisite is that you have an AWS account.  Students:  get $40 in EC2 credit below:

https://aws.amazon.com/education/awseducate/
Open the Cloud 9 tab under Services.

Setup is very fast- just know that if others are going to be editing too, you should understand the IAM policies and what VPC settings you actually want.

 

Know this is ideally a browser-based service; I have tried to come up with a reason an SSH connection would be better and didn’t get anywhere.

For one person, a micro instance is fine.   Know that these virtual “RAMs” and “CPUs” are generous….

The default network settings are set up for you.   This follows good practice for one person; for more than that (or if you are perhaps a far-travelling person), note these settings.  They are always editable under the VPC and EC2 instance tabs.

That’s it!   Other useful things to know:

This is a Linux machine maintained by Amazon.   Packages you think should work and be up to date (arguably like any other Linux machine, I guess…)  may not be.  Check your basics like the NPM installer and the versions of what you’re going to be working on; it very well may be different from what you are used to.

In the editor:

You have two panels of workspace in the middle- shown are Node and HTML.   Everything is managed by tabs- all windows can have as much stuff as you want this way.

Below there is a “runner” (shown with all the default options!) and a terminal window.  Off to the left is a generic file manager.

I hope this is useful; it sure is great for me.

-Jess

Using ESRI ArcGIS / ArcMap in the AWS Cloud

Selling AWS to… myself   🙂

Why struggle with underpowered local machines and VMs or watered-down web platforms for heavy lifting,  learning and work?

In addition to using ESRI software on Mac computers, I am a big fan of the AWS WorkSpaces service (along with all their other developer tools, some of which are map-relevant: RDS for SQL and EC2 Red Hat servers for data management, for example).

Basically, for between ~$20 and ~$60 a month (max, and not factoring in EDU discounts!), a user gets a well-oiled remote desktop.   You can download and license desktop apps like ArcMap and GIS products, file managers, and more from any computer connected to the internet.  This service does not require much savvy; you make/receive a password and log right in.

A big plus here of course is the WorkSpaces Application Manager (WAM); small sets of licenses can be administered in the same way the desktops are, made easier by the fact that they are already part of the same cloud anyway.

Another plus is that any client- netbook, MacBook, VM, etc.- will work equally well.  In this regard it can be a very cheap way to get big-data work done on otherwise insufficient machines.  Local storage and file systems work well with the client application, with the caveat being network speed.

🙂

https://docs.aws.amazon.com/workspaces/latest/adminguide/amazon-workspaces.html

Using ESRI ArcGIS / ArcMap on Mac OSX: 2 methods

Edit 07/26/2020:
Check out the expanded GIS notes page here!


I need to run ESRI products on my MacBook Pro.   QGIS is always the preferred solution- open source, excellent free plugins, works on Mac natively- but in a college / research environment, the only option that supports other people and school machines is ESRI.  Despite the annoying bureaucracy and expense of the software, some things are faster (but not better!) in ESRI, like dealing with raster / multiband data.

First, you need a license.

I went about this two ways:

My first solution was to buy an ESRI Press textbook on Amazon.  A 180-day trial for $50- when taken as a college course, this isn’t too bad.  🙂   The book is slow and recursive, but 180 days to play with all the plugins and whistles allows for way deeper learning via the internet.   🙂

Do know there is a little-documented limit to the number of license transfers you may perform before getting locked either in or out of your software.  I hit this limit, as I was also figuring out my virtual machine situation, which would occasionally need a re-installation.

My current solution is “just buy a student license”.   $100 per year is less than any Adobe situation- so really not that bad.

Now you need a Windows ISO.

https://www.microsoft.com/en-us/software-download/windows10ISO

Follow that link for the Windows 10, 64-bit ISO.  YOU DO NOT NEED TO BUY WINDOWS.  It will sometimes complain about not being activated, but in months of using Windows via VMs, never have I been prevented from doing… anything.  When prompted for a license while configuring your VM, click the button that says "I don’t have a license".  Done.

 

Option one:  VirtualBox VM on a thumbdrive

https://www.virtualbox.org/wiki/Downloads – download for the VM software

http://a.co/4FEYMNY, http://a.co/hanHYl1 – suitable USBs.  The VM will take up most of a 128 GB flash drive- ~70 GB just for Windows and all the stuff you’ll want from a PC.  Add ESRI software and allocate space for a cache (where your GIS project work goes!); bigger is better.   Format all drives in Disk Utility as ExFAT!  This is important; any other file system either won’t fly or could wreak havoc (other FAT-based ones may have file allocations that are too small).

I used two drives, a 128 and a 64- this is great because I can store all my work on the 64, so I can easily plug it into other (school) machines running Windows ArcMap and keep going, without causing issues with the massive VM on the 128.

Installation is straightforward: just install EVERYTHING on the USB drive and it will be fine.   🙂

Problems:   Stability.   Crashes, and Python / some other script modules do not work well.  This is a problem.  ArcAdministrator gets confused about all kinds of things- FWIW, if you are googling how to delete the FLEXnet folder to solve authentication file issues, move to option 2 🙂

Speed is down, but actually about the same as our school "super" PCs (though I happen to know they are essentially glorified "hybrid" VMs too!).

Option two: OSX Bootcamp 

https://support.apple.com/boot-camp

https://support.apple.com/en-us/HT201468

This way, you will hit "option/alt" each time you restart/boot your computer to choose from win/osx.   This is easy to install, as it is mac and mac = easy.

Big Caveat:  it is much harder to install Windows externally (on a USB, etc.) from Boot Camp.  I didn’t succeed in my efforts, but there could be a way….   The thing is, it really wants to run everything like a normal Intel-based PC, with all installations in the usual place.  This is good for Mac performance, but terrible for the tiny SSD hard drives we get as Mac users.  I have a 256 GB SSD.  I have an average of < 15 GB of wiggle room here, and use every cloud service in the book.

If you need to manage your cloud storage because of an itsy Mac SSD, my solution is still ODrive.   https://www.odrive.com/

I use Amazon cloud storage mostly with ODrive, but I also use personal/school OneDrives, Dropboxes, Google, etc., with only the occasional hiccup.   Also, all of the AWS tools are great and cheap- EC2, S3, Cloud 9, Lambda, RDS…. A great way to do your work outside of your Mac via the internet.

Result:

ArcMap and GIS stuff is blazing fast on my modest 2015 i5/8 GB MacBook Pro.  Comparing a huge, mega ATX+ school computer to my Mac on Boot Camp, I am running large raster filtering operations significantly quicker than other folks doing the same type of work.   That is GOOD.

🙂

-Jess

Birding Beyond Binos: 5 Bird apps vs. “the Guide”.

We all have a favorite bird, animal or plant guide.  Peterson is the best at drawing; Sibley takes the best pictures.  Kauffman ties it all together; National Geographic makes a solid reference and Audubon is great for fast looks.

While these books will always have a place on the shelf or table, the depth of content and portability of smartphone apps and trustworthy websites (e.g. research-related or associated with a big bird organization you recognize) truly foster the next level of ecological acuity.

[I will cover apps for iPhones and iPads- these are the tools I have available and find to be indispensable.]  Like the shelf of guides they can replace out in the field, there is always room for another guide- and, generally speaking, these apps cost significantly less than the least expensive print guide on your shelf.

  1. iBird PRO –

This app does it all: view photos, range maps, sounds, and similar birds, and search by band code, Latin/common name.  The sound recordings are pretty good and can be looped individually or as a species playlist (good for playback in research situations).  Similar bird songs are playable at the bottom of each species- great for learning and verifying nuances between similar songs.  The illustrations are “ok”- better than what I could do (obviously) but nothing quite like Peterson or Kauffman.  There are two more (add-on) engines in this app I have not used:  the local birds function by GPS (BAM) and a “humanized” search tool to pinpoint the bird you are looking for (Percevia).

  2. Audubon Birds

Audubon Birds has come a long way, and generally will offer more of a comprehensive written overview on each bird- going into feeding, behavior, breeding, and habitat discussions.  They seem to have added eBird integration (far, far superior to their “nature share” tool) which allows for both a mobile search into the unfathomably large user-based data set for local birds and a way to add your own data to eBird (though traditionally, the best way to do that is from a computer).

  3. Audubon Owls

This app is only a small vignette on owls; there seems to be more info geared solely toward owls here than in Audubon Birds- photos, videos, tips, and tricks.

  4. Merlin Bird ID

Despite the hardcore bird photo ID algorithms and location-based searches, this is geared toward those who may be starting out and want to up the ante.  You fill in a few parameters about a bird sighting (this will not help with bird sounds), then it will generate a list of probable birds.  Supposedly, if you get a good photo of the bird on your phone (digiscoping/Wi-Fi upload?), it can ID the bird visually.

  5. eBird

If you are truly doing an eBird list for your trip, try this app for basic, quick additions- but I would not rely on it for media uploads or anything too crazy.  You can upload your checklist from the field then edit it later, though it is unclear if that is really a good idea in the scheme of data collection.

This is a list of the Bird apps I use on my phone, most getting use many times a week or even every day (iBird Pro).

-Jess

Boutique everything: When The Hobby Grows Up

 

Food.  Clothes.  Art.  Musical Equipment.  Consumer Design and Products.  Can a mere citizen enter the fray of cutting edge design and production?

As a hobbyist designer with a passion for, say, high-end audio, the options for actually producing a quality, well-executed product may seem lucrative and completely not worthwhile.  “It’s just a hobby,” some say, or, “The cost of manufacturing tools or a bid at the factory floor in China is way bigger than my love for sound,” or, “Nobody would ever purchase my design; there are so many other companies who have done this longer than me.”  These answers are all valid, but may not be the complete picture when it comes to local, boutique production.

Can a passionate enthusiast use makerspace technology and peer support to bring small batches/limited runs of high quality products to a localized, niche market?

Could a food connoisseur use networking services to construct a timely supply chain for seasonal meals at local restaurants or cafes?

Would a local tailor be able to source materials and equipment to realise the material science and design they have always dreamed of for a coat in small batches?

Using cutting-edge makerspaces and the subsequent networking opportunities, I believe producing small batches of high quality goods and utilizing a local business/niche marketing approach or distribution system could increase the innovation and quality of any given local economy.

The idea of “group buys” is elementary in DIY audio circles.  Folks going in on a board design for fabrication will often drum up some enthusiasm on the internet or elsewhere, in a move to offset the high entry price of board manufacture.  I have noticed some folks take it a step further, and will not only complete the project they intended to, but perfect the project into a product and do a run of a few pieces to a few dozen and beyond.  This model is actually a great asset to the developing maker; offsetting the cost (or even making a few coins in profit!) of larger projects inherently makes bigger and better projects feasible.   

The folks building audio equipment in their basement, garage, or bedroom are, in essence, artists exploring art through avenues otherwise devoid of artisan qualities.  It is easy to reproduce sound commercially- Apple supplies those iBud-earPod-headBeats with every phone they sell.  Yet, the people in DIY audio are taking on audio components exactly how a great potter would craft a new bowl or coffee cup; functional sculpture, art in one of its oldest forms.

Screenshot of Jazzman’s blog (http://jazzman-esl-page.blogspot.com/)

Below is a picture of one of the quasi-famous Jazzman ESL panels.  A true labor of love and work of art, Charlie has pioneered the processes required to build an ultra-top-end electrostatic loudspeaker within the confines of home, job, and hobby budget.  Now, Jazzman’s speakers are built almost exclusively by hand, using careful measurement techniques to ensure tight tolerances instead of machines that could do this automatically- making these panels truly one of a kind and certainly not an option for even the most ambitious cottage-industry entrepreneurs.

I bring these panels up simply to show what home-brew audio (or any labor-of-love hobby) is about: craftsmanship, dedication, and a desire to learn.  Falling right in with home-brew beer, local pottery, cooking, painting, tailoring, and more, one can see from this artisanal point of view the value in these kinds of work.

Unlike some of these art forms found exclusively in art shows and galleries, only recently has there been an opportunity for individuals to reverse the commercialization of otherwise beautiful hobbies.

 Commercialization and hobbies: can we have both?

You bet.  As individuals get better at their craft and further down the hobbyist rabbit hole, (I personally) wonder where to draw the line as a hobby.  Don’t!  We develop makerspaces to propel creation into hyperdrive; the next and last step in completing the artist’s high-end project circle is selling the last project so the new batch can be justified.  Because rapid fabrication and makerspaces are “a thing” now, people need to understand what comes next with all those creative and production juices flowing.  I think many makers may not approach their custom brazed bikes,  amazing wooden trinkets, or tube guitar amps from the view a painter would monetize paintings- but they (we) should.  Art stores, art shows, audio meetups, DIY ecommerce sites, Etsy, craft conventions…  These are real venues we should be adding to our vocabulary as makers.  It is the last step to a full circle justification, and for me (in my hobby bird photography work for sure) it simply feels amazing to be at that stage of chatting it up with locals about where I took the picture of the merganser.  It takes way more effort than I or my fellow artists will let on, (learning high-end home printing, commerce, getting a materials supplier, website, etc) and marketing/selling is not NEARLY as glamorous as hacking away at our craft.

But, at the end of the day, this is the right thing to do.  Showing others through commerce the true value of maker craft not only educates and enriches, but increases the value in our local economies and local-maker-wizardry.


© 2024 Trans Scend Survival

α wιρ Σ ♥ by Jess Sullivan