Tuesday, September 19, 2017

OpenWRT - Making Wired DLNA Server Visible to WiFi Streaming Device

Overview

This article describes how to solve the issue that arises under these circumstances:
  • You have a DLNA media server connected to the "Wired"/LAN side of a router running OpenWRT.
  • You have a media-player / streaming-client (like a Roku) connected to the WiFi side of the same router.
  • Your player can't find the media server.
  • You've already tried disabling multicast_snooping and that didn't help.
Note: For me, this applies to OpenWRT Chaos Calmer, running on a TP-Link Archer C7, but it probably affects others as well.

What's Really Wrong

That heading is a little misleading.  Part of what's wrong is actually the multicast_snooping thing.  So you still need to do that part, per: https://wiki.openwrt.org/doc/recipes/dumbap#multicast_forwarding
  •  echo "0" > /sys/devices/virtual/net/br-lan/bridge/multicast_snooping
  • Also add that same command to /etc/rc.local so it survives a reboot.
According to the docs on the OpenWRT site and several other places, that's all you should need to do.  The OTHER part of what's wrong, which isn't really mentioned anywhere, is that iptables is probably dropping the multicast packets before they can even be "snooped."

Steps to Get the "Other" Part Fixed

  1. Install support for iptables/firewall rules based on packet-type. (from an ssh prompt)
    • opkg install iptables-mod-extra
    • See: https://wiki.openwrt.org/doc/howto/netfilter#opkg_netfilter_packages
  2. Add a custom rule to your firewall configuration.
    • In the LuCI web interface, that's under the Network->Firewall menu, in the "Custom Rules" section; or, from an ssh prompt, edit (e.g. open with vi or vim) /etc/firewall.user
    • The rule:
      • iptables --insert forwarding_rule -m comment --comment "allow multicast from wired to wireless interfaces" -m pkttype --pkt-type multicast -j ACCEPT
  3. Restart the firewall (from an ssh prompt)
    • /etc/init.d/firewall restart
    • Ensure that there are no errors like "Couldn't load match `pkttype'"

Summary

After disabling multicast_snooping and adding the firewall rule to allow multicast packets to pass from anywhere to anywhere else, the DLNA server, connected via ethernet/wired, should show up immediately on streaming devices connected to WiFi.

Update

Even with the firewall open for all multicast packets, this was still flaky and intermittent.  Then it occurred to me that all of my wired devices were hooked into a Netgear GS116Ev2 - 16-Port Gigabit ProSAFE Plus Switch.  That switch ALSO had an IGMP/MultiCast Snooping feature, and it was ALSO enabled by default.  After turning that off AND disabling multicast_snooping in OpenWRT, the DLNA media server pops right up in the Roku player every time.

References

* https://forum.openwrt.org/viewtopic.php?pid=198895#p198895
* https://wiki.openwrt.org/doc/recipes/dumbap#multicast_forwarding
* https://dev.openwrt.org/ticket/13042
* https://www.garyhawkins.me.uk/dlna-upnp-and-multicast-routing/


Compatibility-Based JSON Schema Versioning

Overview

In a corporate environment, the task of centralizing the "enterprise" data model has had its challenges.  Communicating the definition of what a data object looks like has been rather inflexible with some popular technologies like XML Schema, or awkwardly mismatched with the needs of end-applications using relational databases.  JSON document encoding has become popular for transporting and storing application data, but it is often prone to problems because the common methods of defining what should be expected in a JSON document (structure, field names, etc.) are somewhat haphazard and weak.  JSON Schema goes a long way towards satisfying the need to explicitly define JSON content, but it is still a challenge to implement a process that provides a useful data-document definition, supports meaningful data-validation, and retains JSON's agile, freely-changeable roots.  This article describes a possible approach to getting the best of both worlds, by implementing processes around JSON Schemas that achieve flexibility and clearly defined data-documents at the same time.

Definitions

Since there is some confusion about what means what in the world of JSON data, let's get a few terms clear up front.
"Consumer-View" JSON Schema - an artifact, meant to be published for use by consumers of the corresponding data (i.e. application developers), that describes what can be expected in a JSON document that complies with the schema.  Unlike a database or XML schema, there isn't an expectation that this FULLY describes the document, just that the document should match what actually is defined in the schema.
"Producer-View" JSON Schema - a schema artifact, meant to be used internally (i.e. not published for consumers), that exactly defines every detail of a concrete JSON Document.
JSON Document - a "data document" encoded in JSON, that, if it is advertised as compliant with a particular JSON Schema, should at least include data matching the field names and structures defined in that JSON Schema.

The Problem

The "desired" definition of a data document changes over time.  Attribute names change.  Data types might be altered.  New stuff is included.  Old stuff disappears.  The organizational structure of the data gets deeper or flatter.  Also, if multiple projects require different changes, at the same time, to a data document they use in common, it becomes VERY difficult to manage release timing and cross-compatibility.  If there must be only one JSON Schema that defines what an actual JSON document looks like, that one JSON Schema will end up having impossible constraints in order to meet everyone's needs.

The Typical, Rigid, "One-Schema" Approach

A common strategy for defining a JSON Document is to lock it together with one and only one JSON Schema.  In other words, this demands that everything defined in the schema must be represented exactly that way in the document.  Nothing more.  Nothing less.  This comes with all sorts of concerns and frustrations about when something can be added, and whether anything can ever be renamed or removed.  If an old application was written against a previous version of the JSON Schema, changing anything besides adding more fields either breaks that old application or requires it to be updated.  This also implies the need to keep all instances of the JSON Document in perfect sync with the JSON Schema that defines the document.

The Proposed Solution

Freely change, or "version," the consumer-view JSON Schema as often as necessary, and in ways that would not be permitted if it were rigidly mapped one-to-one with a JSON Document.  Retain all previous versions of the JSON Schema in a published catalog, and include, in each JSON Document, a list of which versions of the JSON Schema it still supports.  Then, separately, if desired, create a "producer-view" JSON Schema to rigidly define an actual JSON document, and "version" that separately.

Detailed Example (Bookstore Theme)


1st Published JSON Schema Version - Everything Starts Out One-to-One

JSON Schema - One field named bookTitle

{
    "title": "Book",
    "type": "object",
    "version": "X",
    "properties": {
        "bookTitle": {
            "type": "string"
        }
    }
}

JSON Document - Complies with one version (X) of JSON Schema

{
    "jsonSchemaVersions": ["X"],
    "bookTitle": "Hitchhiker's Guide to the Galaxy"
}


2nd Published JSON Schema Version - Add Field - Nothing Complicated Yet

JSON Schema - One new field named isbn

{
    "title": "Book",
    "type": "object",
    "version": "ProjectISBN",
    "properties": {
        "bookTitle": {
            "type": "string"
        },
        "isbn": {
            "type": "string"
        }
    }
}

JSON Document - Complies with both published versions of JSON Schema

{
    "jsonSchemaVersions": ["X", "ProjectISBN"],
    "bookTitle": "Hitchhiker's Guide to the Galaxy",
    "isbn": "0345391802"
}

  • Note: The JSON Document still complies with JSON Schema version "X", because all it requires is the "bookTitle" field... and it's still in the document, still has the same name, etc.

3rd Published JSON Schema Version - Oops, ISBN Wasn't Quite Right

JSON Schema - Replace "isbn" with Separate Fields for ISBN-10 and ISBN-13

{
    "title": "Book",
    "type": "object",
    "version": "ISBN-FIX",
    "properties": {
        "bookTitle": {
            "type": "string"
        },
        "isbn10": {
            "type": "string"
        },
        "isbn13": {
            "type": "string"
        }
    }
}

JSON Document - Duplicates Some Data to Remain Compliant with both "ProjectISBN" and "ISBN-FIX" JSON Schema Versions (... for now).

{
    "jsonSchemaVersions": ["X", "ProjectISBN", "ISBN-FIX"],
    "bookTitle": "Hitchhiker's Guide to the Galaxy",
    "isbn": "0345391802",
    "isbn10": "0345391802",
    "isbn13": "978-0345391803"
}


  • Note: This document has everything it needs for all JSON Schemas published so far.  However, any application that is using the "isbn" field just got notified that it may not be around forever.
  • Note: This illustrates a little more clearly how the JSON Document can satisfy the requirements of previous JSON Schema versions, without the "latest" JSON Schema rigidly defining everything in the document.  This JSON Schema does not define the old "isbn" field, but the document still carries it in order to keep supporting the "ProjectISBN" version of the "Book" Schema.


4th and 5th Published JSON Schema Versions - Concurrent Projects

JSON Schema - Add Fields to Support Selling Books

{
    "title": "Book",
    "type": "object",
    "version": "ProjectSellBooks",
    "properties": {
        "bookTitle": {
            "type": "string"
        },
        "isbn10": {
            "type": "string"
        },
        "isbn13": {
            "type": "string"
        },
        "cost": {
            "type": "number"
        },
        "price": {
            "type": "number"
        }
    }
}

Another JSON Schema Published Independently, at the Same Time - Add Fields to Support Inventory Management

{
    "title": "Book",
    "type": "object",
    "version": "ProjectInventory",
    "properties": {
        "bookTitle": {
            "type": "string"
        },
        "isbn10": {
            "type": "string"
        },
        "isbn13": {
            "type": "string"
        },
        "countOnHand": {
            "type": "integer"
        },
        "countOnOrder": {
            "type": "integer"
        }
    }
}

JSON Document - Adds Support for BOTH Projects, Independently - Also Drops ProjectISBN Compliance ("isbn" field is gone now)

{
    "jsonSchemaVersions": ["X", "ISBN-FIX", "ProjectSellBooks", "ProjectInventory"],
    "bookTitle": "Hitchhiker's Guide to the Galaxy",
    "isbn10": "0345391802",
    "isbn13": "978-0345391803",
    "cost": 5.05,
    "price": 7.99,
    "countOnHand": 20,
    "countOnOrder": 10
}


  • Note: This document still has everything it needs for most previously published JSON Schema versions as well as both of the two new ones.  Notice that the two new JSON Schemas do not need to include each other's added fields.  The independent schema changes ONLY affect the document.
  • Note: The jsonSchemaVersions list no longer has "ProjectISBN" because the document no longer supports everything the "ProjectISBN" schema included (i.e. the "isbn" field).  The app developers were warned this was coming!!



Latest Published JSON Schema Version - Single New Project Additions + Cleanup

JSON Schema - Pull Multiple Previous JSON Schemas Together, and Add a few Things

{
    "title": "Book",
    "type": "object",
    "version": "Book2.0",
    "properties": {
        "bookTitle": {
            "type": "string"
        },
        "isbn10": {
            "type": "string"
        },
        "isbn13": {
            "type": "string"
        },
        "cost": {
            "type": "number"
        },
        "price": {
            "type": "number"
        },
        "countOnHand": {
            "type": "integer"
        },
        "countOnOrder": {
            "type": "integer"
        },
        "coverImageLink": {
            "type": "string"
        },
        "synopsis": {
            "type": "string"
        },
        "author": {
            "type": "string"
        }
    }
}

JSON Document - Everything that was Published Before, and then Some...

{
    "jsonSchemaVersions": ["X", "ISBN-FIX", "ProjectSellBooks", "ProjectInventory", "Book2.0"],
    "bookTitle": "Hitchhiker's Guide to the Galaxy",
    "isbn10": "0345391802",
    "isbn13": "978-0345391803",
    "cost": 5.05,
    "price": 7.99,
    "countOnHand": 20,
    "countOnOrder": 10,
    "coverImageLink": "http://mybookstore.example.com/images/covers/img0345391802.jpg",
    "synopsis": "The answer to life, the universe, and everything, is 42.",
    "author": "Douglas Adams"
}

  • Note: This document still identifies all of the previously published JSON Schema versions it supports, and any application that was coded against any one of those listed should still find the fields it knows about, right where they should be.
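
To make the consumer side of this concrete, below is a minimal sketch of how an application coded against one published schema version might check a document before using it.  This assumes Python with the third-party "jsonschema" package; the catalog file name and the function are hypothetical illustrations, not part of the published catalog itself.

import json
from jsonschema import validate  # third-party "jsonschema" package

# Hypothetical catalog of published consumer-view schemas, keyed by version name.
with open("book_schema_catalog.json") as catalog_file:
    SCHEMA_CATALOG = json.load(catalog_file)

def check_document(document, required_version):
    # The document advertises which published schema versions it still supports.
    if required_version not in document.get("jsonSchemaVersions", []):
        raise ValueError("Document dropped support for schema version " + required_version)
    # Validate only against the one version this application was coded for.
    validate(instance=document, schema=SCHEMA_CATALOG[required_version])

# Example: an inventory application only cares about "ProjectInventory".
# check_document(book_document, "ProjectInventory")

The point is that each consumer validates against the version it was built for, not against whatever the newest published schema happens to be.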

Producer-View JSON Schema

One of the main things that seems to aggravate the process of modeling data-documents that are shared by multiple consumers is the lack of separation between the "consumer-view" of the data and the "producer-view" of the data.  Back up in the "Definitions" section, there are two different JSON Schema artifacts defined.  The example doesn't say much (or maybe anything at all) about the "Producer-View JSON Schema."  That's because the example focuses on the primary reason for defining JSON Documents, which is the application-end / consumer-view.

Part of this proposed solution is to stop trying to combine them.  Each of the JSON Document examples above, except for the very first one, didn't match up exactly with the full set of JSON Schema documents.  In some cases, like the multiple, independent changes for concurrent projects, the actual JSON Document wouldn't have exactly matched any single JSON Schema.  This fact exposes the need for an internal-use-only "super-schema", or "producer-view JSON Schema", that exactly defines the content of a document satisfying the requirements of all of its supported "consumer-view JSON Schemas."  While it isn't strictly necessary to create this schema document, having it would help to communicate with the "back-office" developers who need to know what the actual super-set document needs to contain.
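
As an illustration only (this schema is not part of the published catalog above, the "Producer-ISBN-FIX" version name is made up here, and the "required" and "additionalProperties" keywords are just one possible way to make the definition exact), a producer-view schema for the document as it stood at the "ISBN-FIX" stage might look something like this:

{
    "title": "Book",
    "type": "object",
    "version": "Producer-ISBN-FIX",
    "properties": {
        "jsonSchemaVersions": {
            "type": "array",
            "items": { "type": "string" }
        },
        "bookTitle": {
            "type": "string"
        },
        "isbn": {
            "type": "string"
        },
        "isbn10": {
            "type": "string"
        },
        "isbn13": {
            "type": "string"
        }
    },
    "required": ["jsonSchemaVersions", "bookTitle", "isbn", "isbn10", "isbn13"],
    "additionalProperties": false
}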

Summary

This approach to JSON data modeling resolves a few perplexing challenges.  It sets aside the need to keep a single data-document definition (schema) in lock-step with the document "instance" that satisfies the requirements of that definition.  It also identifies the opportunity to proceed with the concerns of a data-consumer separated from the concerns of a data-producer.  Finally, it alleviates the "backwards compatibility" burden by reducing it to just "compatibility" with published versions of a JSON Schema, never mind whether they were published before, after, or at the same time as any other JSON Schema with which a JSON document may also be compatible.

Monday, June 19, 2017

Connecting to Cassandra Cluster via SSH Tunnels with the DataStax Java Client/Driver

Introduction

This is probably a little obscure, but if you have only one choice for connecting into a remote environment, like AWS, and that happens to be an SSH connection with tunnels to a "jump box", and you need to connect to a Cassandra cluster using the DataStax driver, I suspect that's why you found this, so read on.

The problem is...

DataStax wrote their Java driver to use Netty instead of the core network connection classes in a typical Java virtual machine.  Netty is written to use Java's NIO API, and NIO does not recognize JVM-wide proxy settings like socksProxyHost, so the driver always attempts a direct connection to whatever host/port the Java code specifies.

The other part of the problem is...

Connecting the DataStax client/driver to one node of a Cassandra cluster results in a handshake that retrieves network information for the other nodes in the cluster, and the driver then tries to open additional connections to them.  If the primary connection is established via an SSH tunnel, the network information for the rest of the cluster nodes is likely to be routable only within the remote environment, so those additional connections fail even if you have created additional SSH tunnels, because the driver knows nothing about them.

The solution (in a nutshell)...

Create tunnels for all of the cluster nodes, and register an instance of the DataStax AddressTranslater when the connection to Cassandra is opened.

The solution (details)...

The JSch library makes it somewhat easy to open an SSH connection with tunnels.

Assume tunnelDefinitions is a collection of simple TunnelDefinition POJOs, each containing the attributes for one local-to-remote host/port mapping.
A three-node cluster might have mappings of bindAddress:localPort:remoteHost:remotePort like:

  • 127.0.0.1:19042:cassandra-cluster-node1:9042
  • 127.0.0.1:29042:cassandra-cluster-node2:9042
  • 127.0.0.1:39042:cassandra-cluster-node3:9042
public void connect(String jumpUserName, String sshPrivateKeyFilePath, String jumpHost, int jumpPort) {
    this.jumpHost = jumpHost;
    this.jumpPort = jumpPort;
    jsch = new JSch();
    try {
        LOGGER.info("Using SSH PK identity file: " + sshPrivateKeyFilePath);
        // Point to the PK file for authentication
        jsch.addIdentity(sshPrivateKeyFilePath);
        LOGGER.info("Opening SSH Session to Jumpbox: " + jumpHost + ":" + jumpPort + " with username " + jumpUserName);
        session=jsch.getSession(jumpUserName, jumpHost, jumpPort);
        Properties config = new java.util.Properties();
        config.put("StrictHostKeyChecking", "no");
        session.setConfig(config);
        session.connect();
        for (TunnelDefinition tunnelDefinition : tunnelDefinitions) {
            // Note: Each call to "set" is actually an "add".
            // Note: The bind addresses are typically localhost or 127.0.0.1.
            session.setPortForwardingL(tunnelDefinition.bindAddress, 
                tunnelDefinition.localPort, tunnelDefinition.remoteHost, 
                tunnelDefinition.remotePort);
        }
    } catch (JSchException e) {
        e.printStackTrace();
    }
}

Then, using the same tunnelDefinitions, implement a DataStax AddressTranslater...
AddressTranslater customAddressTranslater = new AddressTranslater() {
    private SshTunnelHelper sshTunnelHelperRef = sshTunnelHelper;
    private Map<String, InetSocketAddress> translationMappings = new HashMap<>();

    @Override
    public InetSocketAddress translate(InetSocketAddress inetSocketAddress) {
        // Lazy Load
        if (translationMappings.isEmpty()) {
            for (SshTunnelHelper.TunnelDefinition tunnelDefinition : sshTunnelHelper.getTunnelDefinitions()) {
                InetSocketAddress local = new InetSocketAddress(tunnelDefinition.bindAddress, tunnelDefinition.localPort);
                InetSocketAddress remote = new InetSocketAddress(tunnelDefinition.remoteHost, tunnelDefinition.remotePort);
                String mappingKey = remote.toString();
                LOGGER.info("Registering Cassandra Driver AddressTranslation mapping with key: '" + mappingKey + "'");
                translationMappings.put(mappingKey, local);
            }
        }
        // Note: The result of InetAddress.toString() has a leading "/"
        String keyToMatch = inetSocketAddress.toString();
        LOGGER.info("Cassandra driver is attempting to establish a connection to: '" + keyToMatch + "'");
        InetSocketAddress matchingAddressTranslation = translationMappings.get(keyToMatch);
        if (matchingAddressTranslation != null) {
            LOGGER.info("Matched address translation from config properties for: " + inetSocketAddress.getAddress().toString());
            return matchingAddressTranslation;
        } else {
            LOGGER.info("Retaining unmatched InetSocketAddress: " + inetSocketAddress.toString());
            return inetSocketAddress;
        }
    }
};

The connection to the Cassandra cluster can then be established with the AddressTranslater...
Note: Even if the Cluster object is built with an AddressTranslater, the initial contact point must be manually translated first:
InetSocketAddress initialContactPoint = new InetSocketAddress("cassandra-cluster-node1", 9042);
InetSocketAddress initialContactPointTranslated = customAddressTranslater.translate(initialContactPoint);
LOGGER.debug("Initial contact point (translated): " + initialContactPointTranslated.toString());
Set<InetSocketAddress> initialContactPoints = new HashSet<>();
initialContactPoints.add(initialContactPointTranslated);
final Cluster cluster = Cluster.builder().withAddressTranslater(customAddressTranslater).addContactPointsWithPorts(initialContactPoints).build();
final Session session = cluster.connect("mykeyspace");

Monday, March 6, 2017

Arduino OLED BitMap Animation

Summary

On occasion, I bump up against a little tech challenge that just ticks me off enough that I won't let go until I have defeated it.  While making a special purpose remote-control for a camera aimer, I wanted to use a tiny, inexpensive OLED as a feedback indicator showing which direction the remote device was pointing.  I thought it would be simple enough to display a little bitmap depiction of the camera, rotated to correspond with the direction of the actual camera.  However, it wasn't that simple.

Challenges

  • Creating the initial bitmap was a bit tedious. (...to me anyway.  I suspect my only real solution for that would be more artistic talent.)
  • Converting the bitmap to C++ code required some web searching
    • Found option 1 (online): http://manytools.org/hacker-tools/image-to-byte-array/
    • Found option 2 (Windows): http://en.radzio.dxp.pl/bitmap_converter/
  • Displaying a rotated bitmap wasn't part of the library API for the OLED display
    • and it isn't trivial to just write a rotation function
      • https://forum.arduino.cc/index.php?topic=420182.0
    • and it isn't really quick enough
      • see post #12 of the previous forum thread.
    • and I doubt the result would have looked very good anyway.
  • Each 64x64 bitmap requires about 1/2 KB (64x64 bits = 512 bytes) of the limited 32 KB program memory on an Arduino (ouch).
    • so I realized I'd have to compromise and only include a bitmap for each 10 degree increment, using a total of about 18 KB (36 images @ 0.5 KB each).
      • as it turns out, that's probably good enough, but it's still a trade-off.  I would have preferred a little more granularity.

Abandoned the First Attempt to Create all 36 Bitmaps

After deciding that using individual bitmaps encoded as a C++ char array was really the most practical option, I started doing the rotation task in Photoshop.  The process promised to be very tedious.  I don't like tedious.  Even after transforming and saving each 10 degree rotation as a separate image, I would still need to upload every image file, one at a time, to the "image-to-byte-array" web site to convert it to C++ code.  The Photoshop processing could have been done with a recorded macro, I guess, but it was taking about 10 minutes per image to scale, rotate, color-reduce, and clean up extraneous bits.  I really didn't want to spend the next 5 hours doing the rest of the images this way, so I spent a few hours trying to find another way.

ImageMagick to the Rescue

After a short time, I remembered a command-line tool that I have found very handy for tasks like this in the past, ImageMagick.  While I was reading the ImageMagick docs, examples, and forum-posts explaining how to rotate an image, which, frankly, was all I had expected I'd get from the command line tool, I noticed that it was capable of doing a reasonably good job of interpolating the right pixels for a 2-color off-center rotation of the bitmap too (using the Scale Rotate and Translate / SRT function).  I was then really excited to find that ImageMagick could convert an image file to a C/C++ header file.  After a bit more web searching for various examples, I managed to boil the whole process down to 3 ImageMagick commands to produce a header file (C/C++ code) for each rotated image. 

The commands are (using a 10 degree rotation as an example):
  1. magick original_bitmap.png -antialias -interpolate Spline -virtual-pixel transparent -size 64x64 -distort SRT 10 rotated_10_deg_bitmap.png
  2. magick rotated_10_deg_bitmap.png -channel alpha -auto-level -threshold 50% two_color_10_deg_bitmap.png
  3. magick two_color_10_deg_bitmap.png -define h:format=gray -depth 1 -size 64x64 -alpha extract bitmap_10_deg.h
Using a Windows batch/cmd script (which was easier than writing a *nix shell script since I was on a Windows machine anyway), I could quickly produce the full set of header files.  Using the "for /L" command and inserting variable references in a few key places, the script loops through the 10-degree increments and creates a C/C++ char array with hex-encoded (e.g. 0x0E, 0x00, etc.) data representing each image.

All that was required to finish automating the process was to:
  • add a few lines for #ifndef, #define and #endif (to avoid build issues with multiple includes),
  • and use a Windows port of the "sed" command to customize the default variable declaration (static const unsigned char MagickImage[]) with a distinct name and extra keywords (PROGMEM).

Other Possibilities

Before moving on to the actual example script, it's worth noting that image rotation isn't the only way to use ImageMagick to "pre-formulate" bitmaps for an OLED (or other single color displays).  ImageMagick is capable of a multitude of other "distortions" to show movement or perceived effects like 3D flipping.   If rotating an image isn't exactly what you want, you may find your answer by reading through documentation pages like this one: http://www.imagemagick.org/Usage/distorts/

The final Windows command script is as follows:

@echo off
set MAGICK_CMD=c:\win32app\ImageMagick-7.0.5-Q16\magick.exe
set SED_CMD=c:\win32app\unixgnu\sed.exe
set HEADER_OUT_DIR=..\

for /L %%i in (0,10,350) DO (
    %MAGICK_CMD% original_bitmap.png -antialias -interpolate Spline -virtual-pixel transparent -size 64x64 -distort SRT %%i rotated_%%i_deg_bitmap.png
    %MAGICK_CMD% rotated_%%i_deg_bitmap.png -channel alpha -auto-level -threshold 50%% two_color_%%i_deg_bitmap.png
    %MAGICK_CMD% two_color_%%i_deg_bitmap.png -define h:format=gray -depth 1 -size 64x64 -alpha extract bitmap_%%i_deg.h
    echo #ifndef ICON%%i > %HEADER_OUT_DIR%\bitmap_%%i_deg.h
    echo #define ICON%%i >> %HEADER_OUT_DIR%\bitmap_%%i_deg.h
    %SED_CMD% -e "s/char/char PROGMEM/g; s/MagickImage/bitmap_data_%%i/g" bitmap_%%i_deg.h >> %HEADER_OUT_DIR%\bitmap_%%i_deg.h
    echo #endif >> %HEADER_OUT_DIR%\bitmap_%%i_deg.h
)
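
If you are not on a Windows machine (or just prefer Python), roughly the same loop can be driven from Python by calling the same magick commands and doing the sed-style edits in code.  This is only a sketch under the same assumptions as the batch script (ImageMagick 7's "magick" on the PATH, original_bitmap.png in the current directory, headers written to the parent directory):

import subprocess
from pathlib import Path

HEADER_OUT_DIR = Path("..")

for deg in range(0, 360, 10):
    rotated = "rotated_%d_deg_bitmap.png" % deg
    two_color = "two_color_%d_deg_bitmap.png" % deg
    header = "bitmap_%d_deg.h" % deg

    # Same three ImageMagick steps as the batch script above.
    subprocess.run(["magick", "original_bitmap.png", "-antialias", "-interpolate", "Spline",
                    "-virtual-pixel", "transparent", "-size", "64x64",
                    "-distort", "SRT", str(deg), rotated], check=True)
    subprocess.run(["magick", rotated, "-channel", "alpha", "-auto-level",
                    "-threshold", "50%", two_color], check=True)
    subprocess.run(["magick", two_color, "-define", "h:format=gray", "-depth", "1",
                    "-size", "64x64", "-alpha", "extract", header], check=True)

    # Equivalent of the sed rename plus the include guards.
    body = Path(header).read_text()
    body = body.replace("char", "char PROGMEM").replace("MagickImage", "bitmap_data_%d" % deg)
    (HEADER_OUT_DIR / header).write_text(
        "#ifndef ICON%d\n#define ICON%d\n%s\n#endif\n" % (deg, deg, body))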

Notes on Magick command options used:

Some of these explanations may not be quite right.  This represents the best understanding I had time to obtain, so if any of it is a bit off, please leave a comment with a better explanation.
  • Converting from original PNG (saved "For Web and Devices" from PSD file in photoshop as 2-color PNG8) to rotated PNG
    • -antialias produces an image that has fuzzy edges that are a better approximation of what the rotated image should look like
    • -interpolate Spline gives the best results for translating the lines and spots in the original image
    • -virtual-pixel transparent fills in the alpha-channel transparency for pixels that are set on an edge (instead of the pixel's color)
    • -size 64x64 saves dimensional info in the output image so the next step doesn't whine about %h and %w being missing
    • -distort SRT is the number of degrees to "scale rotate translate" which basically accomplishes an in-place rotation without clipping
  • Converting from the rotated PNG to the BW (black and white) PNG
    • -channel alpha tells ImageMagick to use the alpha channel instead of one of the color channels to pick the output pixels
      This is necessary because the rotated image is essentially a gray-scale image with a transparent background
    • -threshold 50% yields a good final pixel on/off choice, based on the transparency/alpha values.
  • Converting from the BW PNG to the C/C++ header file
    • -define h:format=gray tells ImageMagick to output to just image bit data bytes without GIF or PNG header info included
    • -depth 1 constrains the output to 1 bit per pixel as required for the OLED (each pixel is either on or off)
    • -size 64x64 (may not be required  TODO: experiment)
    • -alpha extract tells ImageMagick to use only the alpha channel info in the PNG instead of every color channel.

Conclusion

What would have been a lot of tedious work creating derivative images with Photoshop (or a similar image editor) and various other online/GUI-based tools was accomplished with a bit of scripting and a spectacularly useful (and free) command line tool.  Hope this comes in handy for something you're working on.  Please leave a comment and let me know if you found it useful.



Monday, November 21, 2016

De-Annoying Amazon Smile with a UserScript

There are things on many web sites that are annoying, but if I just have to use them once in a while, I shrug and keep going.  However, when there is something annoying on a web site like Amazon.com, that I have to put up with all the time, I sometimes hit my frustration limit and try to fix it.  A while back, I got tired of the search category on Amazon.com switching to the category of the currently viewed product.  That's pretty much NEVER what I want.  More recently, I had my fill of the Amazon Smile popup just below the search box grabbing the focus to tell me what I already know... I'm supporting a charity with BLA BLA BLA.... shut up already.

The easiest way to de-annoy a web site, if you know a little HTML and JavaScript, is to install a user-script browser plugin (like Greasemonkey, Tampermonkey, or Scriptish), and add a user-script that fixes the web site's issues after it loads.  That's how I managed to fix Amazon.com (and smile.amazon.com).

It was easy to fix the first issue by just setting the selection on the search category drop-down box back to index zero ("All") every time any amazon.com page loads.  In the user-script, that just looks like this:

document.getElementById("searchDropdownBox").selectedIndex=0;

The second issue was a little trickier, but still not too tough.   The issue may be unique to me and my habits, but I have to believe there are others who are aggravated by this usability failure.  If you click in the search box, and then get the mouse cursor out of the way by moving it down a bit, you end up triggering a hover-pop-up for your Amazon Smile charity.  That would be fine, except it takes focus away from the search text box and forces you to move the mouse cursor somewhere else, letting the pop-up go away, before you can type anything.  There were a few things I would have bought at Amazon.com but that stupid pop-up just ticked me off and I decided they could do without my business those times.

So, to fix the moronic hover-pop-up, I found the page elements that trigger the pop-up and set their visibility style to hidden (and one element's display to none), which means the hover isn't triggered when the mouse cursor is over them.  The following code goes in a user-script:

    document.getElementById("nav-pldn-msg-wrapper").style.visibility = "hidden";
    document.getElementById("nav-supra").style.visibility = "hidden";
    document.getElementById("nav-pldn-org-name").style.visibility = "hidden";
    document.getElementById("pldn-supporting-arrow").style.visibility = "hidden";
    document.getElementById("pldn-supporting-arrow").style.display = "none";


This is just an example of how to de-annoy a web site you might use frequently enough to spend a little time suppressing those annoying "features."  User-scripts are good for all kinds of things like this.  Leave a comment if this was helpful or if you've figured out how to de-annoy a frustrating web site you use.

The whole user script is shown below:

// ==UserScript==
// @id           FixAmazonAnnoyances
// @name         FixAmazonAnnoyances
// @namespace    http://deannoy.amazon.com/fixamazonannoyances
// @version      1.0
// @description  Fix some of Amazon's annoying page automation
// @author       Whirly
// @include      https://*.amazon.com/*
// @run-at       document-end
// ==/UserScript==
(function() {
    'use strict';
    // Pop the search back to "all" automatically to prevent the next search
    // from being constrained to the department of whatever item might be displayed.
    document.getElementById("searchDropdownBox").selectedIndex=0;
   
    // hide the amazon smile hover-popup stuff that takes the focus away from
    // the search box if the mouse is moved a bit after clicking in the search box
    document.getElementById("nav-pldn-msg-wrapper").style.visibility = "hidden";
    document.getElementById("nav-supra").style.visibility = "hidden";
    document.getElementById("nav-pldn-org-name").style.visibility = "hidden";
    document.getElementById("pldn-supporting-arrow").style.visibility = "hidden";
    document.getElementById("pldn-supporting-arrow").style.display = "none";
})();




Saturday, October 22, 2016

Fright Props Park Motor Mod

Fright Props Motor

There's a company called FrightProps that sells a really awesome geared electric motor for a reasonable price.  They primarily sell them for the purpose of haunted-house prop animation, like spinning the head of a robot-zombie or vampire around while its devil-red eyes flash and shrieking noises emanate from a little speaker in its left arm... but I digress.

Re-purposed

I had a slightly different purpose in mind for which these motors are also well suited.  I'm putting them in Jigging machines for ice fishing.

PicoVolt

I'll have another blog post with WAY more details on that.  Anyway, FrightProps also sells a nifty little controller box (called a PicoVolt) that works very well for running the motor through a short sequence of recorded motions (e.g. forward a bit, backwards a bit, around and around for a little while), but the motions are not very precise and getting the recording done is not unlike putting a greeting on voicemail... "Hi, you've reached..." !@#$%.  "Hello, I can't take you call..." grrrr... no.  "Hi, Leave me a message and I'll try to..." (sheesh, my voice sounds so goofy)... whatever... "Leave a message.  BEEP".  Back to the re-purposing, the PicoVolt has its place, but it isn't perfect for what I'm doing.

Park Switch

Inside the gear box of the motor is a "park switch."  The motor resembles a windshield wiper motor and I suspect the "park switch" is the same kind of feature that allows wiper blades on some cars to find the spot where they're tucked out of the way, and stop there.

Proprietary Pairing

In order for the FrightProps park motor to work with their PicoVolt controller the way they've designed it, the motor comes with the red wire (one of the motor coil voltage wires) also connected to the "park switch".  When the motor is in the "park" position or as it is passing through that position, since there is voltage applied to the motor coil, that voltage also flows to the "yellow" (common) wire of the park switch.  It will be positive or negative, depending upon which direction the motor is turning.  That appears to be part of how the PicoVolt senses that the motor has reached the "park" position.  In fact they may have designed it so that the motor has to be running forward (red-wire-positive) for the park function to work at all.

Arduino Uses 5V TTL Logic Voltage

My slightly different purpose for the FrightProps motor also has a slightly different requirement for the park-position switch.  I'm operating the motor using an Arduino micro-controller and a "high-amp/high-voltage motor shield" so I can precisely program (using Arduino/C++ code) the sequence of motions.  I want to be able to sense when the motor is in its "park" position, but I didn't want to try figuring out whether the motor was currently running forward or reverse, and I didn't want to add extra electronic components (resistors, zener diodes, capacitors, etc.) to convert and isolate the motor voltage (sometimes as much as 13V) to a clean "logic level" +5V signal.

Motor Mod Time

The simple solution was to modify the FrightProps motor a little from the way it is wired when they ship it.  By disconnecting the red wire from the "park switch" and adding a separate wire (with white insulation in my case), I could keep the entire park switch circuit separate from the motor voltage and wire it directly to Arduino input pins.  The diagram here shows approximately how the switch works and includes notes about where I made the changes.


Also, here are a few photos showing the modifications on the actual motor.

Remove the shrink wrap where the red wire is soldered to the park position switch.
Heat the solder and disconnect the red wire from the park switch, leaving it connected only to the motor coil.
Solder a new wire to the park switch and cover both the bare spot in the red wire, and the solder joint on the new wire, with shrink wrap.

Monday, September 19, 2016

PVC Console for Hobie PA-17T

The Hobie PA-17T has a huge head start for rigging with the "h-rails" down both sides, floor hatches, vantage seats, mesh rubber side pockets, and rod holder holes on either end.  For the first few trips, I thought that would about cover it, but I put a cooler in the middle and got really annoyed having to remove and replace bungee cords dozens of times a day so it wouldn't slide around and get in the way.  As with most kayaks I've ever had, the easiest and least expensive remedy for shortcomings in the stock rigging always seems to lead back to PVC pipes and elbows.  The Hobie PA-17T's lack of a good place to put a cooler was no exception.  So, in case someone else is looking for inspiration, here's what I did...

The rail around the cooler and the posts that go into the scupper holes are made from 3/4" PVC.


The rod holders on either side are made with 1-1/4" PVC crosses, pipe and caps.  The bungee cords are retained by threading them across a short cross-cut piece of 1-1/4" PVC.


The 3/4" PVC is adapted to the 1-1/4" cross with a step-down bushing.


The 3/4" pipes extend a few inches into the scupper holes, which, combined with a bungee on either side, keeps the whole thing very secure.

Parts List

  • (4) 1.25" PVC Cross
  • (2) 0.75" PVC Cross
  • (4) 1.25" PVC External Cap
  • (4) 0.75" PVC 90deg Elbow
  • (4) 1.25" to 0.75" Bushing
  • (1) 0.75" 10' Sched 40 PVC Pipe
  • (1) 1.25" 10' Sched 40 PVC Pipe
  • (1 Small Can) General Purpose PVC Cement 

SketchUp PVC Drawings