Time to Define Tables

I’ve worked with databases on and off for a long time, but in general I’ve been accessing previously defined tables or have had very simple needs (DynamoDB kind of forces that).

I’m at the point in my sandbox work where I’m about to define a substantially more complex schema than I’ve used for anything in the past. I’m finding that getting my command of SQL DDL, and of the various bits involved in creating MySQL users and setting permissions, up to speed is taking some doing.

I still can’t seem to get at a database on a remote system (boojum, my little test machine). I am not getting a ‘no connection’ response but an authorization failure, so I expect that the identities I created on the NUC aren’t quite right. Always details to be dealt with…

I am finding that tables proliferate. Everything needs a unique ID (and auto-increment columns help enormously here). Any sort of one-to-many relationship (keywords, validation) winds up with a new table containing the source object key and the targets. In the past I’ve used the DB as an index for more complex storage (DICOM-3 pretty much forces this with its many 1..n fields; DynamoDB encourages this as well, with bulk information in S3).
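As a sketch of that pattern (table and column names here are made up for illustration, not my actual schema), a keyword relationship ends up looking something like:

```sql
-- Illustrative only: an auto-increment surrogate key on each entity,
-- plus a separate table carrying the one-to-many keyword links.
CREATE TABLE file_entry (
    file_id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    path    VARCHAR(1024) NOT NULL
) ENGINE=InnoDB;

CREATE TABLE keyword (
    keyword_id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    word       VARCHAR(64) NOT NULL UNIQUE
) ENGINE=InnoDB;

-- One row per (file, keyword) pairing.
CREATE TABLE file_keyword (
    file_id    INT UNSIGNED NOT NULL,
    keyword_id INT UNSIGNED NOT NULL,
    PRIMARY KEY (file_id, keyword_id),
    FOREIGN KEY (file_id)    REFERENCES file_entry (file_id),
    FOREIGN KEY (keyword_id) REFERENCES keyword (keyword_id)
) ENGINE=InnoDB;
```

Every relationship of this shape grows its own little join table, which is exactly how the table count creeps up.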

In the current case I really want to keep everything in the DB rather than spilling items overboard. If I get to thumbnails I’ll see how the BLOB types work. For now I expect that I’ll be OK with the 64K per-row limit on MySQL InnoDB tables. I expect that long BLOB and TEXT values aren’t stored in-row with the rest of their tables, so they should permit ‘stretch’ operations.
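If I do get to thumbnails, my understanding (still to be verified) is that a column along these lines would do, since InnoDB keeps long values off-page and the row itself stays small. Names here are hypothetical:

```sql
-- Hypothetical thumbnail table: MEDIUMBLOB allows up to 16 MB per
-- value; InnoDB stores long BLOB/TEXT contents off-page, with only
-- a prefix/pointer kept in the row itself.
CREATE TABLE thumbnail (
    file_id INT UNSIGNED NOT NULL PRIMARY KEY,
    image   MEDIUMBLOB NOT NULL
) ENGINE=InnoDB;
```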

Current working notes are a bit chaotic 🙂

DB Initial Sketch
Chaos in the Schema

I am sticking with the MariaDB fork of MySQL to avoid Oracle entanglements. Once the basic database definitions are in place I expect to code the front end in C# as I need to polish up that language and environment a bit at the moment. Not sure whether I’ll go with WPF or some sort of web interface for the UI yet. Got to get the innards a bit better defined first.

Looks like I was just messing up my command line for remote mysql access. Once I got things straightened out, I could talk to the database instance on boojum from chaos. That should make things a bit cleaner, as I can keep the data in one spot and access it from any system in the house. Now I just have to finish the create table statements and see if they build what I need. Short digression into users and permissions first (and probably a quick look at alter table) to see what I should be doing for remote access.
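For the record, the shape of the user/permission statements involved looks something like this (user, password, subnet, and database names below are placeholders, not my actual setup):

```sql
-- Placeholder names throughout: allow a user to connect from any
-- machine on the local subnet and work with one database.
CREATE USER 'archiver'@'192.168.1.%' IDENTIFIED BY 'changeme';
GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, ALTER
    ON archive.* TO 'archiver'@'192.168.1.%';
```

After which something like `mysql -h boojum -u archiver -p archive` from another box should connect.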

Nice case for my Raspberry Pi-3

Looking very similar to my NUC (but smaller and much less expensive), this case works very well for the R-Pi that I’ll likely use to drive my RepRap if/when I get around to finishing it.

The one issue I have with this case is the limited access to the main board interfaces. It looks as if there is a dedicated ribbon cable slot to allow access to the main external interface. The alternate interfaces are much less accessible, as they’re on the top of the board and between the board and the case when fully assembled.

The other issue I have (though for my purposes it isn’t a big deal) is that micro-SD card access is limited. The card does not extend beyond the case fairing, and while it can be extracted with a fingernail between the case edge and the card, I’d be reluctant to use it if I were expecting to change out cards regularly.

 

The bottom of the case is vented and there’s a thermal pad that could be used to sink the CPU heat to the case. I have skipped this as I expect to be taking the board out now and again for access to the hardware pins.


Third Edition Effective C# Just Arrived…

The new edition of Effective C# just arrived in the mail. The ‘Effective’ series has been consistently helpful, with advice beyond the basics of language syntax. As I’m currently spending significant amounts of quality time with C# for work and also for some sandbox coding at home, this is a welcome item.

 

Looking forward to seeing what is in this new edition and what it has to say about version six of the language. I’ll try to write up comments on it when I’ve had time to read through this one.

Looks like I’ll need to upgrade my Git-fu

Finding myself nuking and re-cloning a lot. I’ve tended to use Git just for local repo stuff (and rather focused work at Amazon). Now the Git repo I’m interacting with is much richer…going to have to up my Git game to manage things without too much messing about. Merge failures in large pulls get lost too easily…I expect there is a command that can point them out to me…got to find it…

Add in cloning from a specified branch (cloning from the default branch and then pulling the required branch isn’t a great idea).

 

I’ve got a new perspective on microstepping…

For some time I’ve been wondering about microstepping. It would seem that if you stop at a microstep position, you’re going to either be PWM-ing the windings or driving them in the linear region of the drive transistors. In either case I’d think the power dissipation in the drive devices would be substantially elevated. Add in the reduced holding torque from partially driving the relevant windings, and it has seemed like a bad idea.

I just recently read an article that suggested that the main value of microstepping is smoother motion (and at least implied that you’d generally not want to stop unless you were at a full- or half-step location). That I can buy…you’ll not be spending extended amounts of time in the linear region of the drive transistors. You effectively approximate a smooth hand-off between windings and use gear ratio to manage increased resolution (or higher torque) as needed.
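If I have the standard rule of thumb right, the holding-torque side of this can be put in numbers: for a sinusoidally microstepped motor with μ microsteps per full step, the torque available to resist a one-microstep displacement is roughly

```latex
T_{\mathrm{inc}} = T_{H}\,\sin\!\left(\frac{90^\circ}{\mu}\right)
```

where T_H is the rated full-step holding torque. At 16 microsteps per step that works out to sin(5.625°) ≈ 0.098, under a tenth of the rated torque, which squares with the advice to stop only on full or half steps.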

 

This may put a crimp in the Java shops out there…

According to The Register, Oracle has begun shaking down companies that use Java for some rather steep fees. It appears that they slipped bits of less-than-free code into the various packages they’re distributing and are now targeting companies that have managed to step on the click-through license terms for serious cash.

I can’t say I’m at all surprised…Larry’s folks have made it clear that they intended to monetize Java since they bought Sun. Given Oracle’s rather predatory and underhanded past exploits, this approach seems in line with what could have been expected.

…and still lisp…

Pulling in some open source common lisp source trees. Not sure whether windows or linux is a better target but for now I’m mostly running windows so I’ll start there.

Quite a few options to choose from. I’ll probably poke at a few of them before I’m done. Currently seems to break out between:

  • Common Lisp running on various back-ends. Attractive because this seems to be the most ‘mainstream lisp’ style. Seems to be sort of a super-set of everything that has gone into the lisp world pretty much ever.
  • Clojure as a JVM-based, functional-programming-oriented version of the language. Seems to be well supported and popular, but I suspect that it has diverged significantly from its lisp origins. This was more attractive when I was working for Amazon Robotics and everything I was doing professionally ran on the JVM. Still a nice language package…
  • Scheme appears to have been the first consistently lexically scoped lisp implementation. It seems to be in the process of forking into a slim, simple version channeling its roots as a teaching language and a bulkier but more capable language aiming more at the place the common lisps play.

I expect that these will keep me occupied through the end of the weekend (partly putting off diving deep into formally writing the DDL for my sandbox project’s schema) and perhaps even have one or two environments up and running.

One thing that I’m wondering about is the suggestion I’ve seen in several sites that common lisp code gets distributed as a ‘live’ workspace. I’ve become very used to the idea that there is a source code set that builds any particular version of the code being put together. I expect to have a bit of a struggle if lisp means adapting to a world where you can’t exactly rebuild your workspace from source without significant effort. More playing with these languages should tell me whether this impression is accurate.

Digressions and returns…Lisp, NUC and SQL.

I’ve mostly got the pieces of SQL DDL together that I need to define the tables for my sandbox project to manage file archiving. Hit this weekend after a tiring week and let myself get distracted.

I had bought an Intel NUC 6i5 to replace my ‘test target’ machine that has been randomly hanging and rebooting lately. The NUC setup and OS install went well except for the NIC driver. Wireless worked perfectly, but the driver for the gigabit NIC either wouldn’t see the controller (Intel driver install packages) or saw the controller but then timed out before completing. As this was on a clean Windows install with nothing present except for the Intel driver packages, I’m getting a replacement from Amazon. Should arrive today…hoping all goes smoothly as the NUCs are very nice little machines.

I let myself get distracted by some articles on Clojure and then wandered down into Scheme and Common Lisp. The various lisp dialects have always had a bit of allure as hugely expressive languages with very simple syntax. Nothing that I’m likely to ever use professionally (though you never know), but cool toys to play with.

Clojure seems to be the closest to mainstream relevance with its JVM hosting and functional programming focus. Not sure I’ll do much more than poke at these but who knows.

Trying to get aimed back at DDL for the tables I need and then start piecing together C# code and native PInvoke stuff to get me where I need to go. Would be nice to be able to thumbnail canon raw files and PDFs (even better to get at metadata) but that will come later. Expecting that to involve serious native code execution as most of the SDKs for such things are in C or C++.

 

SQL and Sandbox Moving Forward

Spending some time improving my SQL with a focus on MySQL and the file archive management tool I’ve been framing up.

I looked at SQL Server Express, but given that I’m more familiar with MySQL and there are limitations on SQL Server Express that might eventually bite, I’m sticking with the open source tool. SQLite looked interesting as well, but a full-featured server engine seems like a better idea for this piece of work.

I’m currently trying to work through the initial schema for the database, looking at tables and relationships. This is significantly more complex than any SQL layout I’ve designed from the ground up before, and I’m doing the whole job here. Should be interesting.

Once I have the schema sketched out and have figured out how best to define the tables and relations I’ll be moving on to get some coding in. I expect to build most of the machinery into one or more C# DLLs with the UI in a separate assembly (not sure currently but probably WPF initially). I’ll likely have a side-car native DLL to handle things like volume serial numbers that aren’t directly accessible from C#.