February 06
Focus Writing

One of the biggest hardware, space, and workflow challenges I've had over the past several years has been how to focus on writing. I am not generally a believer in "distraction-free writing environments" especially when talking about using vintage computers that way. I am, however, a believer in other things that encourage good writing habits.

I'm a strong believer in the idea that there are things you can do to encourage yourself to write, but that you won't write unless you're ready. I've found space to be a good helper, but I've never figured out exactly what influences my productivity while writing.

I have never had trouble writing on any given computer. I don't need a computer to be set up in such a way as to be distraction free, nor do I need to do things like set up specific vintage computers to encourage writing. (Not to mention the perhaps somewhat obvious workflow limitations of using a vintage computer for writing, at least the writing I do.)

I have a theory that it's not the computer or even the place, but the way that place is set up. With a recent change to my living arrangement (short version: I have a housemate now), I've been forced to come up with creative places for the things I use. For the first time since I moved into the house, I have a computer in a formal location in my bedroom. Right now, it's just a little table with a chair at it, and the laptop floats back and forth between my recliner chair and the table, but I do find that writing at this little table is a bit more productive.

It's probably just because it's easier to force "distraction-free" if I only need to do it to one computer at a time. Maybe I'll see if I can use this setup to do that photo tagging I've been meaning to do for several years now. I wonder what it would look like if I set the desktop up in that space. I seem to have a lot more luck with writing and other productivity on laptop form factors. I don't know if it's just because I'm used to it, or if it's because my desktops almost always have a lot of other stuff nearby and it's hard to get into a particular flow when I have several computers competing for my attention.

It works well to have a bunch of computers at work, if only because the very nature of my work is that I am switching between tasks so fast that it doesn't make a difference if they're on one system or the next. The core of that work is being contacted for assistance. If I had a job where I had to focus on a single task or project, I'd probably reduce to a single machine, maybe even switch to an all-in-one and a single display.

I have a feeling that with a little more reason to do it and a few more tables overall, I would be that person whose desk is painstakingly cleaned and has one computer set up in such a way that the space still looks minimalistic: either the tower would be hiding elsewhere, or it would be an all-in-one computer.

And perhaps that's the secret I keep missing. I have had some luck at my tall desk which just has the fast PC at it, but maybe I should set up a similar sitting desk, or get a better chair for the tall desk.

One of the places I'm most productive is away from my home. I think there's a magical combination of only having one screen available, being away from certain resources (fast home Internet, the media library, and so on), and the feeling that time spent in a public space such as a café or a restaurant must be spent well. I've also had good luck writing when I'm on battery, although I don't think that's a specific requirement.

It usually helps to have an idea of what I'm going to write, whether that's a vague idea, a draft I'm going to rewrite, or a specific outline.

Even workflows that require multi-tasking, such as bouncing ideas off an IRC channel, using a web browser for research and links, a text editor or OneNote for holding onto temporary bits of text, and then a word processor or other writing program for the actual text, work better for me when I'm on the go.

Much of this applies to tasks other than writing; the biggest problem is that writing is one of the easiest things to do on the go. Doing more intensive tasks on the go requires either that I can reliably work at home or that I have a bigger, faster mobile computer that holds more data and also reliably runs for a long time on its battery. More to come on that issue, though.

January 30
Vintage Tiered Storage

I recently participated in a discussion where somebody essentially presumed somebody else was "stupid" because they (as a UNIX system administrator) weren't aware of a long-deprecated feature in a version of Windows that came out fifteen years ago.

There was a pretty lengthy discussion about the fact that stupid and ill-informed are different things, and that (especially for a professional UNIX admin) not knowing about a feature that was deprecated ten years ago seems irrelevant. I won't harp on it too much here, but suffice it to say that people with ideas like these grate on my sensibilities, because it tends to be a symptom of a larger pattern of negative language that is often both inaccurate and in extremely poor spirit.

The technical component was interesting though. This person has an old server (using SCSI disks) with probably about 700 gigabytes of usable space. This is a common configuration for a server in the late Windows 2000 and early Windows 2003 era. The person has an external SCSI card and a tape library with some number of LTO-3 drives. Some amount of the tapes installed are dedicated to backup, and the remainder are dedicated to a cold storage tier. There are 22 terabytes of addressable cold tier space in this person's system.

For those unfamiliar with the idea, Windows Remote Storage Manager (2003) allows the extension of capacity and the better utilization of resources by placing infrequently used files on tapes, which have traditionally been cheaper to scale than hard disks. Today's version of this is basically to use a few solid state disks for the most active part of your dataset, and then start scaling down to different types of disks. You might use small 10k disks for the next tier. Then 4TB/7200RPM disks for the next tier, and then 8TB SMR 5400RPM disks for the last tier. Today, tapes basically don't exist in active tiered data storage situations, because they're just too slow, and because expectations are different.
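To make the tiering mechanics concrete, here's a rough sketch of the kind of policy involved. This is my own illustration, not how Remote Storage Manager actually made its decisions, and the tier names and age cutoffs are assumptions:

```python
import os
import time

# A minimal sketch of an access-age tiering policy; my own illustration,
# not Remote Storage Manager's actual algorithm. Tier names and cutoffs
# are assumptions for the sake of the example.
TIERS = [
    (30,  "ssd"),        # touched in the last 30 days: fastest tier
    (180, "10k-disk"),   # last six months: small 10k disks
    (365, "7200-disk"),  # last year: 4TB/7200RPM disks
]
COLD = "smr-or-tape"     # older than a year: 8TB SMR today, tape back then

def tier_for(path, now=None):
    """Pick a tier for one file based on its last-access time."""
    now = now or time.time()
    age_days = (now - os.stat(path).st_atime) / 86400
    for cutoff, tier in TIERS:
        if age_days <= cutoff:
            return tier
    return COLD
```

A real HSM like Remote Storage also leaves a stub behind so the file still appears in place and is recalled from the slow tier when opened; that recall is exactly why year-old files "take longer to load."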

If you control the environment and you tell the accounting department that files they haven't touched in a year are going to take longer to load, then you know what to tell them when they call while trying to open documents from two years ago. You can't exactly put the oldest bits of somebody's Dropbox account or last year's Facebook pictures on tapes. Even in internal institutional environments, there's an expectation that years-old data will be easily at hand. I don't think that expectation is unreasonable given how much less expensive server hardware and bulk storage are today than they were even just a few years ago.

There were the normal lamentations about why Microsoft removed it from the system, but to me it seems pretty simple. Remote Storage Manager is a vestigial component of Windows that just didn't fit with the strategy or the needs of real customers, even in 2008. In 2008, SATA and relatively big disks and the ease of expanding SAS controllers meant that buying shelves of disks with monumental capacities was cheaper than dedicating another tape library (or a portion of your primary one) to a cold tier.

Although the feature had existed for years, I'm guessing the big difference is that, for high-availability reasons, Microsoft wanted to push customers to the Distributed File System, which allows for much better high-availability configurations and makes more sense today than it would have in the '90s or the 2000-2003 era anyway, because servers and disks use a lot less energy per terabyte than they did all those years ago.

As I mentioned, the person we were talking with has about 22TB worth of tapes, and 700GB worth of hot storage area. I don't know what their habits are generally like, but I know I need a lot more space than that to be active. I have a lot of data that should probably be sent to a robust-but-inexpensive cold storage location, but I'm not going to do that by adding a big tape library to my server.

It got me thinking… this person was very proud of their 22TB. I think they think they're in some kind of big leagues. The availability of inexpensive slow external and internal disks (things like 8TB SMR disks from Seagate) prompted a vehement reaction: you should never use USB storage, and important files and any large amount of data should always be on a server!

It's an interesting and ultimately unreasonable expectation. There are many datasets that need to be stored locally. Lightroom libraries, as one example, cannot be stored on a server. You can use synchronization tools to store that data on multiple desktops, however.

Back to servers though… storing large amounts of data is less expensive than it has ever been. 8TB disks, SMR and PMR, are inexpensive enough that you could put five of them in a Drobo 5C and have 24 or 32TB of fault-tolerant storage on a server or desktop. You can put six 3.5-inch disks in most entry-level servers today, so you're talking about 32 or 40TB of fault-tolerant storage. (Side note: I'm considering a USB 3.0/3.1 card for TECT and a Drobo 5N or 5C as a backup system, perhaps as a sort of "disk to disk to other disk" setup, or as the tier you see before moving things to tapes or RDX cartridges manually.)
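The arithmetic behind those capacity figures is simple enough to sketch; this assumes one or two disks' worth of redundancy and ignores formatting overhead:

```python
# Rough usable-capacity arithmetic for the configurations above;
# real enclosures lose a bit more to formatting and filesystem overhead.
def usable_tb(disks, size_tb, redundancy):
    return (disks - redundancy) * size_tb

for disks in (5, 6):
    dual = usable_tb(disks, 8, redundancy=2)    # survives two disk failures
    single = usable_tb(disks, 8, redundancy=1)  # survives one disk failure
    print(f"{disks} x 8TB: {dual} or {single} TB usable")
# -> 5 x 8TB: 24 or 32 TB; 6 x 8TB: 32 or 40 TB
```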

I'm sure that in its day there was a lot of success and cost savings to be had by using a tape storage tier. Today, I just don't think it's reasonable. You can buy extremely dense disk enclosures from normal PC vendors these days, and for any enclosure attached to a controller that supports 4k sectors, you can easily swap out old disks for new ones to increase density and capacity on existing systems. You can often switch out disks one at a time and then grow into the new capacity using whatever software or RAID controller you're using.

In light of the fact that a single disk enclosure can be equipped to run over 600 terabytes of disks, I'm not sure it's necessary for storage tiering products using tape to exist. With disks using less energy than ever, density going up, disks costing less than tape below a few hundred terabytes, and the ability to spin down disks, it doesn't make an awful lot of sense to bother with tapes. Especially since at "web scale" you need several copies of a piece of data anyway, meaning that tiered storage with tapes has to work that much harder to meet the expected level of service.

I think it's neat that somebody has done it and documented it, but there are few enough situations where it's relevant that it probably didn't make sense to keep in the software. Using deduplication to reduce the overall load on disks in a server, and using true archiving solutions to offload old data, probably makes more sense than using tape storage in this context. I think that tape archiving and tape backups are good ideas, but a tape-backed file server doesn't make as much sense.

January 23
The Meta of Storage Choices

The most interesting update in the world of vintage Macintosh is probably the new Tiny SCSI Emulator. It challenges the SCSI2SD for the performance crown among storage replacements for vintage Macs. It is also easier to hand-assemble, and it promises future functionality such as using SCSI for Ethernet and video output connections, functionality once provided by real SCSI peripherals.

Naturally, there's a discussion going on about this, but much of the basis of the discussion appears to be how smug people can be. The original post was pretty bad even for Ye Olde Computer Forum, basically decrying the use of the SCSI2SD as unmanly in a world where you should, for some reason, be using old mechanical hard disks with SCSI-based computers. Never mind the fact that it is essentially impossible to rebuild or meaningfully refresh mechanical hard disks, and that as the oldest 68k Macs are now well over 20 years old, finding machines with working disks is getting increasingly difficult.

A few years ago, a trend was to adapt fast, modern server hard disks for use in 68k Macs. The real problems with this are that the supply of those disks isn't unlimited, and that the Mac community isn't the only one searching for a modern disk or a better storage solution. Various UNIX computers use those disks too, and those systems are generally better able to cool and use disks like this (and similarly, are very poorly served by traditionally low-performance solutions such as the SCSI2SD). Even pretty bad SCSI systems like the late NetBurst-based Xeon workstations and servers can use a lot more disk performance than a SCSI2SD delivers, at least as it stands today; it is likely to get better, and the SCSI2SD v6 is said to be much faster.

The other component of the discussion was essentially that there can be no vintage Mac hobby without physical repair skills. It's an unfortunate discussion to have to have. I know a lot about Macs, and I've got a lot of context on the history of the platform and the history of the community. Back in the day, when most of these computers were entering their second lives, my use for them was as my main computer. Today, as all of them are over 20 years old, my main use is pretty different. I'm running fewer "modern" Internet applications, using more pro applications, and doing more of the things I wanted to get into when I had my older Macs but couldn't, for various reasons.

There are different reasons to use these computers, and not everybody is at the same skill level, both with Macs and with physical electronics tasks such as soldering or board-level diagnosis (figuring out which particular chip failed given particular symptoms). The problems with this attitude are that it's exceedingly exclusionary, and that it suggests people shouldn't be able to enter the hobby without skills they may not be able to gain without first joining the hobby. The other problem is that it excludes people for whom learning those skills doesn't make sense. I think there's a presumption among people that everybody will find soldering to be fun, because they do. I don't want to exaggerate here, but I just don't think this is true. If the knowledge and rate of learning that newcomers show about basic concepts of the platform is any indication, I'm not expecting many of the literal children to pick up soldering very quickly; plus, they're literally children, so there's often the problem of being able to get supplies. At the other end of the spectrum are people who just don't have the time, plus other reasons why someone might not be able to do a bunch of soldering.

The trouble is that it's hard to deny that soldering is a valuable skill, and so because it's valuable, people extend that to mean that it's somehow mandatory. I'm sure everybody does this to a certain extent. It's a side-effect of the fact that we're all around for different reasons, and there is a correlation between people who place a high importance on soldering skills and people who happen to be good at it, either because they are more interested in electronics than computers or because they have enough vintage computing hobbies that they've entered one where it's inescapable.

But that doesn't help those of us with homes small enough we can't set up a soldering station, or who just never have the time or energy to do it, or who don't have the physical dexterity for it, and so on.

The SCSI2SD sort of ends up being a proxy for the conceit that everybody should have soldering skills. You can't hand-solder a SCSI2SD yourself, because it is assembled using methods that require special skills, and buying a SCSI2SD requires the admission that existing storage technology, and the availability and complexity of disks and adapters, made it more worthwhile to replace the disk entirely with a device that accepts SD cards. I can see why it might not be ideal to admit that you've reached the end of a traditional solution. The other comment about the SCSI2SD was that it's expensive. It's more than $0, but it's not as expensive as many people think, and the price has been coming down steadily over the years.

January 16
2016 in Macintosh

The end of the year has been unkind to Apple, probably because Apple has done a lot of things this year that were interpreted as slights by user communities, even among technical crowds and the tech press, which are often generally supportive of what Apple does.

Apple's main accomplishments of the year are, in no particular order:

  • New WatchOS and Watch hardware
  • Making iPhone users angry by removing the headphone jack
  • Introducing a true successor to the iPhone 5S and dramatically mis-estimating the demand
  • Introducing the 9.7-inch iPad Pro hot on the tail of the 12.9-inch version
  • Introducing questions as to whether the 12.9-inch iPad Pro had been delayed, and creating a situation where neither is a clear leader (the 12.9-inch model has more RAM, but the 9.7-inch version has a far better display)
  • Leaving any discussion of updated Macs until the very end of the year, at which point it introduced new MacBook Pros with controversial new connectors
  • Failing to say literally anything about the entire rest of the Mac lineup

Apple has generally been unkind to the Mac over the past several years. It has been worse this year. Some of this has been a relatively natural progression of Apple's relentless search for sleeker computers and better integration. Some of it has been questioned over the years of course. The best examples I can think of are the removal of the optical disk drive in the iMac and the new change to USB Type C ports.

I'm usually fine with Apple's changes, but a lot of them feel like they're being done somewhat capriciously by a company that has had a little too much success predicting the future. I've argued before that some of the changes Apple has pushed over the years are damaging to users. Floppy diskettes were fine, and I don't specifically mourn the loss of the optical drives, but it's starting to feel like Apple's doing things specifically to spite its users.

I think there are a few problems with this. I don't think Apple is truly contemptuous of its users, despite all appearances. I think the problem is that Apple massively misjudged excitement for the new Touch Bar feature (as with Force Touch on both iOS and Macs), and in a presentation about a machine whose base price ranges from $1799 to over $4000, it seems tone-deaf to spend literally any time talking about the machine's ability to do predictive text input (on a keyboard where people can easily train themselves to type at over 100 words per minute) and easily select from different emoji.

USB Type C is absolutely the future and annoying though it may be, I don't think Apple is wrong to jump all in on it. I do think that the pricing of the machines is wrong, and that Apple was probably wrong to leave the entire 2015 MacBook Pro lineup in place at their existing prices. (Perhaps to drive the point home, Apple should have discontinued the 2015 MacBook Pros and dropped the price on the 2012 MacBook Pro, the MD101LL/A, that was still on sale until the keynote.)

A lot of these things aren't inherently bad, but drama surrounding the MacBook Pro, combined with continued poor messaging has caused additional drama. On top of all of this, Apple has been almost utterly silent about the rest of the entire Mac platform.

The silence has probably been the worst. It is mostly well understood that Apple is, in a lot of ways, at the mercy of its suppliers for things like new generations of processors and graphics cards. This is especially relevant to the Mac Pro, which is using 2012's finest Ivy Bridge-EP processors and probably 2012 or 2013's finest AMD GPUs. Apple skipped Haswell and Broadwell for no discernible reason, so the hope is that now that AMD has Vega GPUs, Apple will update the Mac Pro to use those and the Skylake-EP CPUs due out a little later in the year.

Old and a little slow though it may be, the Mac Pro is still considered to be the most capable Mac. It holds the most RAM, has the most ports, and it has that dual-GPU configuration, if you use that kind of thing. It's a very good computer, but it only truly effectively serves one of Apple's many constituencies.

Chuq Von Rospach suggests that the Mac Pro, along with a potentially re-balanced MacBook Pro that prioritizes performance and capacity over thinness and battery life, should be considered a strategic machine: the ultra-high-end hardware that video editors and developers of varying kinds are happy to be able to buy. The argument here is that these are the people who influence the opinions of others and help them choose Macs. Without those folks, there are fewer people running around specifically evangelizing the Mac.

Technical users (and even people who power-use specific apps, but may not power-use the whole system) are often implicitly tech support for others around them, and in addition to influencing opinions, they influence recommendations for new technology purchases because it's easier to support the thing they use.

I have personally been disillusioned with Apple for a long time, although they are making extremely compelling hardware, and their software (what little they still make, but Mac OS X specifically) is now better than it has been for several years. I sometimes find myself wondering exactly how many Mac purchases I've encouraged or dissuaded over the years. I normally tell people that Windows computers are perfectly fine, and that if they already know Windows there's no specific need to get a Mac. Would I have said those things if I'd been happier using Macs?

To make things worse, Apple now has messaging trouble with the iPad. The 9.7-inch iPad Pro shipped only a few months after the 12.9-inch model, with a tremendously upgraded display, but with only ("only") 2 gigs of RAM. There continues to be consternation about whether or not the iPad, any model, deserves the "Pro" tag. This is mostly the same crowd saying that Apple should stop selling the MacBook Pro under the "Pro" label. I'm mixed. It's clear that "Pro" has meant "nicer" for a while now. Apple advertises the iPad Pro hardware as being capable of a lot of very intense computing tasks, 4k video editing and photography chief among them, especially with the newly improved displays and more memory.

The trouble is that Apple has done a poor job shepherding the iOS ecosystem into something that can truly do this work. Perhaps Apple has something up their sleeves, but they've been saying this for years, and I've been writing about it for years, and as far as I can tell, network connectivity will never really catch up with the demands of using iCloud/OneDrive/Google Drive storage for large data files.

An attentive version of Apple would realize that workflow and multi-tasking are important and build machines that cater to the needs of certain customers. It doesn't even need to be a big two-socket, slots-and-disks workstation (although there are many who would buy that product, even if they did not strictly "need" it) – Apple just needs to say something about the Mac.

Almost more important than the opinion-makers is the fact that a system like the Mac Pro is needed for the best experience when doing Mac and iOS development work. I don't know how Apple builds Mac OS X, but the logical choice would be to use some kind of Mac for the job. The Mac Pro makes the most sense, and if that's what Apple is doing, it would make sense that Apple has motivation to update the machine.

Granted, it could very well be that Apple is using servers from some other vendor to compile Mac OS X components, thus side-stepping the very issue that Mac users have, which is that there is nothing better than the Mac Pro for a lot of work, and building a non-Apple Mac "works" but is not at all a solution for people outside of Apple doing similar work, or different but just as demanding work.

The more I think about it, the more I think that there may be some room to re-balance what the Macintosh platform looks like. I think that there's room to re-position both the Mac mini and the Mac Pro upward. Apple could (maybe even "should") build a Mac mini that is a little bigger (perhaps even using desktop bits) and a new Mac Pro to sit above the current one with a little more configuration flexibility.

Another option may be to build both the Mac mini and the "Mac" on the same board platform, with the "Mac" coming in a taller enclosure that allows a better CPU and a discrete graphics card (basically 27-inch iMac specs) and replace the Mac Pro with a bigger, more flexible machine entirely.

The first step, however, is doing literally anything to acknowledge that the Mac is an important product, and announcing some kind of plan to do literally anything with the veritable pile of three-or-more-year-old products still being sold today. The MD101LL/A was a great joke, and at a lower price point it could have continued to make sense, but the 2013 Mac Pro is just sad.

January 09
Setting Priorities and Making Time

I originally set out several weeks ago to write a post about my long-standing desire to be a video content creator. I was going to write about how creative I'd been as a child and how much video crap I did any time I had access to a camera. Then I realized I was talking about prioritization, although indirectly.

I made plans to shift the post and create a video blog, both to talk about that post and possibly to talk about my desire to post regular videos of some sort to YouTube; whether they'd be tech talks, health updates, or just a general video log of day-to-day happenings, I wasn't sure yet.

After, perhaps ironically, failing to write anything at all (for the good reason that I caught another cold), it occurred to me that I should just rewrite it (I had one and a half sentences done) with the starting idea that we all prioritize different things, and that there are different factors leading to prioritization.

The background I was going to explain and work through a little bit in the previous post was that as a child, I found the process of producing video content incredibly interesting. I didn't think I would go work in TV, but I felt like I had a lot of stories to tell, and I frequently ran up against the limitations of both technology, and what I had. A friend and I made several short videos on a computer I had using Windows Movie Maker, a webcam, and a USB extension cord, and we had the better part of a longer story with more varied settings recorded on a DV tape, but the tape would get rewritten and it would be years before I had access to both the camera and a computer I could capture and edit with.

Once I got that, I did some more things, although the limitations of storage meant I couldn't do much. Later, I had only limited access to video equipment for a while. Most recently, I have ample video, storage, and compute resources, but what seems like fewer ideas and less time.

I've been wanting for a few years to get back to video stuff, but the thing I don't have today is attention. I have a few videos on YouTube, and I take clips on my phone and my camera from time to time, but even though I've wanted to for years, I've never had the wherewithal to keep up a video production workflow or a particular series of things for any length of time.

I've tried a few different things – video games, server administration stuff, personal blogs, one train video I managed to capture, among other things – and none of it has worked.

I think what it comes down to is prioritization. Every time I start to work on video stuff, it does "work," but I am not able to keep up with the other things I do, whether that's playing games, meeting up with people, writing, or just having mental downtime. Certain types of videos I'm both mentally and physically unable to prioritize. I couldn't go make a bunch of train videos, because those are quite physically intensive, and while I'm in a good place, I'm not in that good a place.

The other thing I worry about is my ability to create short fictional videos or periodic personal blog content. The world doesn't really need another "tech" channel or podcast, per se. I would want to find something unique, which might be easy to do in concept, but difficult to do in practice. Although it would be technically easy, I don't particularly want to be a talking head channel, certainly not on a technical issue.

But all these are excuses, and I think at the core of it, I know I could find something if I wanted to and put some time into it, and if I did, I could produce something somewhat reasonable, and get better at the craft. I could create a conservative posting schedule or even make things varied so I don't get bored.

I was originally going to make a video to pair with this post to make the announcement of regularly posted video content, but it never happened and to make it worse, this post was delayed by a week due to my usual winter troubles, minor illnesses made worse by the continued existence of my chronic illness.

We'll see if I end up coming up with anything. I was pleased with myself this past year for avoiding long and arduous hospital stays and with relatively few exceptions I made a blog post every week. It's down from where I was a few years ago when I was writing about music, but it has been good nevertheless.

Perhaps I will never get into video production, and my YouTube channel will remain as it has always been – a vessel for random things I see or do, for the times when something particularly good comes up. None of this will stop me from eventually spilling a few thousand words about the workflow challenges keeping it from happening, though.

December 26
The Loss of Floppies Was That Bad Though

Last week, I wrote about the Type C connector for USB, Thunderbolt, and other things being encouraging. The tech news media has been publishing article after article about how it's all doom and gloom, about the fact that the market for peripherals is still relatively nascent, and about how the standards are still evolving.

Near the end of that piece, I compared it to a previous big transition that Apple kick-started and forced: the removal of 3.5-inch floppy diskettes from computers. We've been discussing the issue in #68kMLA on irc.oshaberi.ne.jp on and off for a few weeks and I think it's worth writing something about it here.

Before their removal from computers, floppy diskette drives were essentially the baseline storage media. 3.5-inch, 1.44 megabyte drives had been common since the late 1980s on Macintosh as well as x86 "PC" platforms, and were available as upgrades to many other types of systems, although often at great expense. Early on, 3.5-inch diskettes were a huge improvement on the 5.25-inch media they replaced, and for years, they were considered enough. Until the mid-late 1990s, most (if not all) software (outside of CD-ROM specific multimedia titles) was available on them, and they were the most common way to transfer data or make backups of a computer.

I believe the removal of the floppy diskette (and in general, the market's failure to develop and implement a viable replacement before then) was much more damaging to computer users (financially and operationally) than the change to the USB Type C connector has been.

There are a lot of reasons for the failure to develop a new standard. I think a lot of it is basically a failure to recognize that the floppy connector could or should be a generic thing. Serial connections on Macs were almost never used for storage, and although PC parallel ports were used for various types of storage, they were nearly never particularly fast. Even if the controller could have been adapted for use with newer drives, no newer drives were ever developed for Apple's floppy controller. (Notably, in the 1980s, there was a 20-megabyte external hard disk available for the port, but it only worked on certain early Macs.)

On the PC side of things, there was never a more universal and faster connector compelling for storage (HP-IB, a.k.a. IEEE-488, has been suggested, because it was fast-ish, cheap-ish, and allowed for multiple devices to be connected). SCSI wasn't common on PCs, mainly because it was difficult to set up and because it was expensive.

Several vendors floated ideas for replacements: IBM was playing with 2.88 megabyte drives and diskettes on its own PS/2 and ThinkPad computers; Sony developed the MiniDisc and MD Data (140MB) format in part to be used as a data drive; and floptical technology allowed for 21 megabyte cartridges in a similar footprint to 3.5-inch diskettes. Later in the '90s, there were other formats such as SuperDisk/LS-120 and Zip (100, 250, and later 750), but I don't have information on which of these were intended to supersede the floppy disk as a standard, and how many were meant as expansion systems, the way that EZ-135 and bigger systems such as Bernoulli, Jaz, Orb, and SyJet were.

For a superfloppy format to become standard, PC OEMs and Apple would have needed to bundle the drives (at least) with systems, ship software on the media, and the format would have needed to be, or quickly become, very inexpensive and easy to add to existing computers. Unfortunately, none of these systems ever met all of those criteria. Zip 100 probably came the closest. Unlike the Ditto, the Zip never came in a version for the PC floppy connector, but parallel port and IDE/ATAPI versions were inexpensive options for PCs, internal and external versions for Macs were common, Iomega was among the first to release USB storage devices with the Zip 100, and by the end of the '90s, Apple was in the regular habit of making several of its computers available with internal Zip drives. PowerBook accessory manufacturers also built versions of the drive to be installed in Apple's laptops.

We had a Zip 100 drive attached to our PC at home, and when dad got tired of the parallel port version, he installed an ATAPI model; I used the parallel port version with whatever PC equipment I had around, and I had one disk. Later, I inherited a USB Zip 250 drive, and a Mac I had inherited got the old IDE/ATAPI Zip 100 drive, making it practical for me to use them. (Almost annoyingly, though, because by then I had an Ethernet router and three or four computers connecting to one another with file sharing anyway, plus a DVD burner on the PowerBook I had.)

Rewinding a little, though…

My own reaction to the removal of the floppy diskette drive was delayed, because I didn't have computers without them. Even with floppy drives available, though, my software ecosystem was relatively weak, and I didn't have very many diskettes or a good way to organize them physically.

One of my strongest memories from this era was when I asked a friend of mine who had access to more Internet connectivity to download an MP3 file for me (1). The file was something like 3.4 megabytes, and needed to be split across 3 or 4 floppy diskettes. He used PKzip on a Windows computer to compress and split the file, and then handed me an envelope with the attendant diskettes in it. I had to hold on to those diskettes for like a month while I frantically searched for a program to open that data on a Mac.
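The back-of-the-envelope math on that split checks out; the exact file size and the spanned-archive overhead below are assumptions for illustration:

```python
import math

# Back-of-the-envelope math for the split; the file size and per-disk
# archive overhead are assumptions for illustration.
FLOPPY_BYTES = 1_474_560        # a "1.44MB" diskette: 2880 sectors * 512 bytes
file_bytes = 3_400_000          # "something like 3.4 megabytes"
overhead = 8 * 1024             # assumed per-disk headers for a spanned archive

disks = math.ceil(file_bytes / (FLOPPY_BYTES - overhead))
print(disks)  # -> 3, squaring with "split across 3 or 4 floppy diskettes"
```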

I eventually got the file open and could listen to the song. I didn't think about it in this way at the time, but the limitations of floppy diskettes had been holding me back for some time. Other ecosystems for large data transfers and backups had existed for a very long time, but most were very expensive, many were discontinued by the time I was using computers, and things like external SCSI hard disks, Zip100 mechanisms and even SyQuest and Bernoulli media were uncommon at garage sales and in used computer retailers at the time. I'm convinced most people were keeping these peripherals and moving them from system to system at the time.

I could use floppy diskettes to transfer data, but I didn't have the context or a good way to find out that the paid version of StuffIt was one of the best ways to put a large-ish amount of data on floppy diskettes. I moved some data from one machine to the next, but not a whole lot.

The situation got worse when I picked up a used iMac a few years later. It didn't have a floppy drive, and it wasn't in the cards for us to even get an external diskette drive for it, let alone, say, a USB Zip 100 drive and one or more SCSI Zip 100 drives for my other Macs. The iMac had Ethernet, but none of my other Macs did. The iMac had a modem, but none of my other Macs did (until after I got the iMac, at which point I found a 56k external modem), but I never mastered the art of dialing from one computer to the next. I should have been able to do it with real phone lines, but I was hoping for something involving a direct connection, for performance, reliability, and timing reasons. (It would not have been a leisurely experience to try to transfer a gig of crap from my Power Mac 7300 or Quadra 840av to the iMac via phone lines, but if I could have done it directly, the speed would not have mattered, and I could have moved things around much more frequently.)

Even if I had been able to buy a Zip or SuperDisk drive for the iMac as well as for my other computers, I would not have been able to use that to transfer files to other people. Literally nobody I knew had a SuperDisk drive, and my Mac SuperDisk mechanisms would not work on their Windows computers. I knew one or two other people with a Zip drive, but it was nowhere near universal, and there wasn't one at school.

The first USB thumb drives weren't available until several years after the iMac was, and even though Apple wanted you to believe that the "i" in iMac stood for the Internet, it wasn't really practical yet. Several years later (in 2000), Apple would address the issue with the iTools service, which did include iDisk, an online storage space. It wouldn't become particularly practical until a little later than that under the .Mac moniker. With .Mac, Apple provided some additional software (Anti-Virus, extra Garage Band content, and Backup) as well as space for a web page, an e-mail account, and critically, an online storage space that presented itself as a mounted disk on your computer. You could use Backup to make backups to either the iDisk (Apple's name for the storage space), an external hard disk or to writeable CD or DVD media.

.Mac only offered ten gigabytes of space. It would have been enough to back up writings, a few critical photos, a modest mail database, and perhaps your taxes or other financial data, but you would have been hard pressed to put a whole bunch of video up on it. It served the need though. Just the previous year, Apple released the "DV" version of the iMac G3 computers, which had the capability to capture and edit DV video.

The real trouble with this is that most people in the US were still using dial-up Internet at the time. I never had an iTools account, because I didn't have Internet connectivity at that moment, and I didn't bother using .Mac because it was costly, and I couldn't make very good use of it on my dial-up connection anyway.

Other online storage services wouldn't show up until almost ten years later, when Microsoft SkyDrive and Dropbox arrived in 2007. Those services function differently, however, by syncing the contents of the storage space to your local computer. This worked a little better on slow Internet connections, and it made things simpler, though perhaps not strictly better.

Many people say that it was rewriteable CDs that ultimately replaced floppy diskettes. There is some truth to this, but I would argue not a whole lot. For starters, CD-RW equipment was uncommon and cost a lot more until much later. You couldn't easily put just a few files on a CD. You needed to keep the amount of space for your CD available on your main hard disk, because the process worked like this:

  1. Gather a bunch of files.
  2. Put them into a burning program like Toast or Nero, or, on a Mac, onto a disk image.
  3. Close all other tasks and applications.
  4. Burn the CD, and make sure to test it.

Burning a CD (or later a DVD, which needed even more disk space, took more time, and was more expensive) wasn't just something you could do because you forgot to print your homework on your way out the door before school. It was a big project each time, something you did maybe once every few weeks to get old junk off your computer, not something you could do quickly. There were rewriteable DVDs and CDs, but they were incredibly inconvenient for this task: if you carried a Word document or a presentation to a collaborator's computer, they needed to copy that file (and any others) off the media, make their edits, and then completely re-burn the disc.

Flash drives came onto the scene in the early 2000s, but I don't think they were practical until a few years after their introduction. I came by some flash drives in the 64-256MB range in 2006-2007, by which point home networking had become common, and in 2007 both SkyDrive and Dropbox launched for free, offering a few gigs of file syncing and sharing space.

In the end, there were (and are now) a lot of solutions, but most of them cost a lot, took a long time to materialize, or presumed things that weren't true in 1998 and weren't even necessarily true in 2006. When I got a flash drive, it fell right in with what eventually materialized as my standard workflow. The flash drive was always a swing space, used for storing working copies or copies of specific documents for transmission. My main document store was on the hard disk of my laptop, with monthly "junk archives" going to DVD-R, and backups of certain files periodically going to DVD-R as well.

Later, buying big external hard disks became more reasonable, and even further after that, you could count on destination computers reliably having USB ports and on other locations having an Internet connection. In addition to all of that, online storage spaces have become larger and cheaper. For backups, various cloud services exist that charge less than $10 a month. Some have gotten the cost down to below $5 monthly.

The challenge now is that we have an utter glut of good options for massive amounts of data storage. Pocketable, multi-terabyte hard disks cost approximately what just a few 100 megabyte cartridges did a few years ago. We're at the point where it's generally reasonable to presume that if you can buy a computer you can buy some kind of data storage for it.

It's hard to decide, if you have a computer to back up or some photos to preserve, whether you should spend the money on a cheap subscription service or on an external hard disk every year or so.
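As a toy model of that decision (every price and lifetime here is an assumption for illustration, not a real quote), the two options can land surprisingly close together:

```python
# Toy cost model comparing a backup subscription to buying external disks;
# all prices and lifetimes are assumptions for illustration.
years = 6
monthly_fee = 5.00        # "some have gotten the cost down to below $5 monthly"
disk_price = 120.00       # assumed price of a multi-terabyte external disk
disk_life_years = 2       # assumed replacement cadence

subscription = monthly_fee * 12 * years
disks = disk_price * (years / disk_life_years)
print(f"subscription: ${subscription:.0f}, disks: ${disks:.0f}")
# -> subscription: $360, disks: $360; close enough that it's a genuine toss-up
```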

To use these cheap external storage devices with a new Mac, you need an exceptionally inexpensive adapter. Perhaps you buy two if your computer has more ports. You buy the adapter a single time and then use it for as long as you have any devices with that type of connector.

It will be inconvenient for a few years, and then new peripherals will come along and be purchased to use the new connector. The new connector will presumably last a long time, long enough for the vision of the standards to be finished and bad cables or improperly implemented ones to be weeded out. In all, I am confident it'll be a lot less damaging than the transition away from floppy diskettes ever was. There will, for starters, be a cheap, easy way to connect existing hardware to new machines. There are already peripherals with the appropriate connector, and as far as I have seen, they do not cost more than existing peripherals.

It will be a lot less bad than the removal of the floppy diskette drive.

 

  1. It was "Changes" by 2Pac Shakur. I had been listening to a MIDI rendition of it for a while, and we were both surprised when we heard the real song.
December 20
Type C Is Not That Bad

It's relatively well known that I'm pretty excited for our unified connector USB Type C future.

The industry is off to a rough start with this new standard. In the time before it thought anybody was paying attention, Apple built a few non-compliant cables, and the original 12-inch (Retina) MacBook used some charging methods that weren't quite compliant with the Power Delivery spec. In addition, systems like the Lenovo Yoga 910 introduce weirdness into the whole thing by using the Type C connector for ports that do certain things but not others, for no discernible reason. The biggest of these is that it has a dedicated charging port that doesn't support display or USB functionality, and one other Type C port that performs functions other than power delivery.

In addition to the confusion between USB Type C and Thunderbolt 3 and the different things some laptops do (such as USB power-only connectors), there are different situations concerning things like video output. How do you get HDMI or DisplayPort video out of a Type C port? There are alt-modes for each of these things, but not all systems support all of the alt-modes. You could use a DisplayPort to HDMI adapter, or there are also things like DisplayLink adapters and docking stations. All of this will also depend on how displays are configured within a system. I suspect that outside of laptops and certain OEM computers (probably specific genres, like NUC or Mac mini style computers), our existing video connector situation is here to stay.

There is a lot of consternation in some circles. (Engadget and other similar tech sites also had a fit when Apple made it harder to remove batteries in 2010 or 2011.) The tech press and reviews of systems like the new MacBook Pro family from Apple have been quick to point out what a mess everything is. I'm convinced that things aren't nearly as messy as we all want to believe, though. For starters, Apple's Type C ports (mostly(1)) don't discriminate.

I don't think the situation is as dire as many want to think. Laptop users have needed to carry dongles since literally forever. In the '90s, Apple's laptops used a unique display output connector that nothing else they built used. The cable wasn't included, it was an add-on you needed to buy separately, and there was only one cable available, one that adapted the proprietary port to use a Mac monitor.

Today, USB Type C is an industry standard that should work not only with Apple's own adapters, but with third party products like the LG UltraFine displays, StarTech and Anker adapters, and so on. I think the key is that the industry needs to decide quickly what the best strategies for things like "HDMI output" are and then retailers need to do a better job only stocking compliant cables, chargers, and other accessories. People talk a pretty big talk about definitely needing all kinds of things on their laptops. Heck, at a previous time I was a big cheerleader for this kind of thing. In the next few years, I think people will develop a small, but growing and ideally good collection of accessories until such a time that it starts to make sense to buy things that use Type C directly.

There's a lot of potential for USB Type C to be a much better technology for all of this kind of stuff than we've had before. The LG UltraFine displays, for example, are 4k and 5k monitors using DisplayPort and Thunderbolt transport, with Power Delivery to charge and run a system, and a hub for more peripherals. This is the final form of something Apple has tried over and over to do, with varying levels of success, for at least 20 years(2). These displays could even be the power source for small desktop computers, such as the Mac mini or the Intel NUC. These systems generally use less than 100 watts of power and would be great candidates for a single-cable connection to a monitor that also provides power and a USB hub.

In terms of what exactly people will end up getting, I think it's just a matter of what the needs are. I'm confident that I could take a new MacBook Pro (or similarly equipped Windows computer) out of the box and get to work with no real trouble. I never use wired networking on most of my portable computers today, and the only real peripheral I regularly use with my Surface 3 is a Bluetooth mouse. I know that I would end up with either a multi-port adapter (to get HDMI, a type A, and a pass-through Type C connector) or one or two type A adapters, and I would look for cables such as the C to Lightning adapter for my iPhone and a C to Micro USB 3.0 or two to use with my large fleet of external hard disks. Long-term, I would probably buy the SanDisk SD card reader, and look at migrating my Mac data to a disk like the Seagate Innov8. Drobo also has a new model with a Type C connector available.

In the future, I think we're going to see some interesting stuff. I am still hoping for some creative docking options. A dock with Ethernet, a few Type A ports, power delivery, and a hard disk would be useful for backups of the laptop itself, for example. I also think that we will see more things trend toward being wireless. Canon and Nikon have offered wireless transfer functionality on their professional cameras for years, and so I imagine this functionality will start being built in and becoming better.

Western Digital already sells a portable USB hard disk that has a battery, a Wi-Fi radio, and an SD card reader. You can connect it directly to a system (I have considered getting one for use with the Surface 3), or you can put it somewhere and connect to it as a NAS, either by joining both your client computer and the disk to the same wireless network, or by joining your client computer to a wireless network the disk hosts. That configuration could be useful in a field scenario where multiple people use the system to share and transfer data. A tool like that works with whatever computer you have today, works without any adapters or cables on a new computer that doesn't have a normal USB Type A port, and will perform the functions of both an SD card reader and an external hard disk using a new Type C to Micro USB 3.0 B cable. It's not hard to imagine Western Digital will build the next version of this device with a Type C connector.

There will be switching costs, but I don't think the ecosystem is in anywhere near as much trouble as people claim. I think the opportunity to massively improve the experience of certain devices (I'm looking at you, Surface Pro) by including two or three USB Type C ports instead of three separate ports that each only do one thing is highly enticing. I don't think we should overlook the idea that a future Surface Pro with three Type C ports might much more easily be able to run an external hard disk and a full-sized media reader at the same time(3), making it that much more viable for things like a photo management workflow, in a way that you "can" do today using a USB 3.0 hub, but probably shouldn't. You either need a powered disk or a powered hub (or both, at least with the Surface 3, which struggles to provide much power out of its single USB 3.0 port).

I do think that it may have been better if Apple had packed at least one USB Type C to Type A adapter in the box, but suggestions like that are kind of a slippery slope, because Apple might also have included one of the MultiPort adapters, and a Lightning to Type C cable, and a Magic Mouse (which would justify the lightning cable) and so on. Of course, Apple makes those things available, but doesn't itself bundle that accommodation.

The other thing is that I think this whole thing will be an awful lot less damaging than when Apple removed floppy diskette drives from its systems. That, coupled with data that people were collecting becoming bigger anyway, and there being no clear, widely deployable, inexpensive replacement that would be a reasonably universal way to move data around meant that it suddenly became very difficult to do things like write an essay on a computer in your room and print it on a computer in the living room, or give a few music files to a friend. In the late '90s and early 2000s, it was hard, if not impossible to use Internet services for these tasks (not to mention impractical) and if you bought into a storage ecosystem such as that of a drive like Zip 100, LS-120/Superdisk, or something similar, then you weren't guaranteed that your friends and acquaintances had the same drive. If you bought an external one, you could take it to their house, if their computer had the same connector yours did.

I know the impact for me was that I had little computing islands. My Macintosh Performa 600 and Quadra 840av could share data, via either a LocalTalk connection through their serial ports or floppy diskettes, and those two could share with my parents' Windows PC, but none of those things could share with my iMac, for which we didn't have any USB peripherals or other adapters. Those things were available, but they didn't always help. A USB floppy diskette drive could run you $100 or more over the retail cost of the machine. A USB Type C to Type A adapter or a new cable for a peripheral is $9, presuming you even need it, what with wireless and networked printers and online storage services that can quickly be used to transfer files from machine to machine or person to person.

What was even more annoying is that my parents' computer did have an internal Zip 100 drive. I had my own disk, and I eventually inherited a parallel port drive for a PC I had, but it wouldn't be until several years after I had gotten rid of that iMac that I ever came across a USB Zip drive. I still haven't ever used a SCSI one. Making that investment at the time would have made a material change in how I used all of those computers, and I'd probably still have a lot of my data from that time period today if I'd had that kind of tool available. (Of course, what I really wanted was to put Ethernet in everything and to run a Windows NT 4.0 server, but I failed to get funding for Ethernet cards, a hub or switch, and cabling, in the same way I failed to get funding for Zip or other larger removable media.)(4)

When the time comes for me to start using Type C computers and peripherals, I'll buy adapters and cables, and new peripherals I get will be designed with Type C in mind, and I'll make do. It won't fundamentally change my life, even as somebody who had previously heavily advocated for highly expandable machines with disk bays and slots and a wide variety of different ports, all "just in case."

The world has changed, and our computers should probably follow.

 

Endnotes:

  1. On the 13-inch MacBook Pro with the Touch Bar, the particular Intel platform Apple is using does not have enough PCI Express lanes for both Alpine Ridge controllers they use to be fully equipped. The ports on one side of the system are only capable of half the performance of those on the other side.
    Also: the 12-inch MacBook lacks an Alpine Ridge controller, so it can use USB Type C devices, but not Thunderbolt 3 ones.
  2. In 1993, the Apple AudioVision monitor provided a one-cable connection for video, ADB (keyboard/mouse) and sound to a Mac. The AV Power Macintoshes in 1994 were supposed to add some video input lines for use with conferencing cameras. The cable was completely massive. In 1996-7, the AppleVision monitors carried ADB/Sound/Video on a single smaller cable, using all normal connectors. In the early 2000s, ADC carried digital video, USB, and power for the monitor over a single cable. After that, Cinema displays had a single cable carrying USB, DVI video, and Firewire. After that, the mDP and Thunderbolt monitors added a Magsafe laptop charger for "docking."
  3. I should note that Seagate recently announced a line of desktop USB 3.0 hard disks that have a USB hub in them. I have an 8TB example, but that's an SMR disk and is unsuitable for things like video editing or managing a photo library.
  4. A large part of that, of course, was that I was literally a child, and I don't think I was good at expressing that I was using the machines for productivity. I was also very low on floppy diskettes, and I didn't have any archival software to effectively use floppies to transfer more than a single meg of data.
December 12
Hatt is Immortal

I'm writing this well in advance in an attempt to provide blog coverage during NaNoWriMo. This is a softer subject than my normal posts, mainly because I can't exactly prognosticate what Apple and Microsoft will each do on October 26th and 27th.
I missed some publishing targets in November (and didn't finish writing a few things), so this is going to come out in December.

Every now and again, I look for Thomas the Tank Engine stories on YouTube. I used to be a big fan of the stories on video tape and during Shining Time Station, and I had the big anthology of the original illustrated stories.

The original illustrated stories, and the video series that followed, were compelling because they were based on real events. As the video series' licensing agreement changed and as the company licensing it needed more stories, the content in the series changed.

There was a period in the late 2000s and very early 2010s when the stories were pretty thin. The format at that point in time was essentially: an assignment is given, something is done wrong, then done wrong once more, then a third time. Confusion and delay ensue, corrective action is taken, and then the day is somehow saved. The most recent show episodes and the movies are structured much better, and there are several instances of legitimately good storytelling. It's a refreshing change from the three-strikes structure.

For a long time, I wondered about the economic viability of the island. No governance structure has been set up (as far as I can tell) other than that one of the towns has a mayor; it otherwise looks like the railway runs the island. There's better discussion of this out there than I can manage at the moment.

This post is about some information that came up in a recent episode. I have a new wacky fan theory: Sir Topham Hatt is, in fact, immortal. Thomas is considered, both in Awdry's stories and in the recent movie "First Adventure," to have arrived on the island in about 1923. At the very tail end of that story, Thomas greets the previous #1 engine, a "coffeepot" engine already sitting on a siding at the start of his branch line, long forgotten.

In a recent Season 20 episode of the show, set in the early-to-mid 1960s at the latest (given that Daisy the diesel railcar, based on the BR Class 101 built in the 1950s, is many passengers' first experience with that type of vehicle), the coffeepot engine, revealed to be named Glynn, is rediscovered, and Thomas and Percy spend effort trying to get him repaired. The big dramatic reveal in the episode is that Hatt (who, I'll note, looks the same in 1923 and in the 1960s) built Glynn at the founding of the North Western Railway, but no date or time frame is given for that.

Now, Hatt is clumsy and has his own health issues, but nothing commensurate with what I suspect would have to be roughly an eighty-year span at the head of the railway, if you consider the following (a rough back-of-envelope timeline follows the list):

  • You probably need to be about twenty years old to build a steam locomotive and found a railway in England at the end of the 19th century.
  • Glynn had a "long" and productive service life, but none of the other engines anywhere on the island make any reference to recognizing him, and all of the ones who would have (Edward, Henry, Gordon, James) saw him in one of the two episodes featuring him so far. Call it about 20 years of service, at least based on the needs of US Class 1 railroads in modern days. It was probably actually longer in early 20th-century England, especially on what is essentially a short line that generally accepts or buys second-hand equipment from the companies that went on to make up BR, and later from BR itself.
  • In 1923, when Thomas sees him after his first few weeks on the island, Glynn has already been mothballed for a very long time, following the end of his primary service life. Even in 1923, there was ample space shown on the island to store a recently disused engine, and the railroad is portrayed in almost every story as being perpetually short of power. If the North Western Railway were real, a serviceable engine like Glynn would more likely have been pressed back into service or sold on than left to rot.
  • Hatt hasn't visibly aged between a story set in the early 1920s and a story set in the early 1960s.
  • Lifespans in England in the early-to-mid 20th century were shorter than they are today, and visible aging would have been more rapid and noticeable.
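
As promised above, here's that back-of-envelope timeline as a quick Python sketch. Every constant in it is an assumption drawn from the bullets, chosen conservatively; even so, Hatt comes out at around ninety by the Daisy era:

    # Back-of-envelope timeline for the Hatt-is-immortal theory.
    # Every constant is an assumption taken from the bullets above.
    FOUNDING_AGE = 20          # assumed minimum age to build an engine and found a railway
    GLYNN_SERVICE_YEARS = 20   # assumed "long and productive" service life
    MOTHBALL_YEARS = 10        # assumed minimum time "long forgotten" before 1923
    THOMAS_ARRIVES = 1923
    DAISY_ERA = 1963

    founding_year = THOMAS_ARRIVES - MOTHBALL_YEARS - GLYNN_SERVICE_YEARS
    hatt_age_1923 = FOUNDING_AGE + (THOMAS_ARRIVES - founding_year)
    hatt_age_1960s = hatt_age_1923 + (DAISY_ERA - THOMAS_ARRIVES)

    print(founding_year)   # 1893 at the latest
    print(hatt_age_1923)   # 50 at a bare minimum
    print(hatt_age_1960s)  # 90 at a bare minimum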

Take that for what you will, but it looks clear enough to me: the only reasonable way Hatt could have done all of this is if he were immortal. The show offers no discussion of other possibilities. It's not Hatt's father or a cousin, and Hatt makes no denial of the idea that it was he who built Glynn.

It would be interesting to see more stories from the earlier days of the railroad, especially with the modernized storytelling and visual formatting of the show. The story of how Percy came to the railroad has yet to be re-told, for example, and I think that some of the other daring rescues over time would make great episode fodder.

I also know that there's some newer Awdry material that would do well in the new format, and some possible retconning could help alleviate the Diesels-versus-Steamies situation that has evolved in the new show. For example, if BoCo were to come to the new format to help Bill and Ben, and Derek (the Paxman-engined Class 15/17 diesel) were to show up, perhaps still having reliability problems but staying friendly to the existing cast, that would go a long way toward the show itself promoting literally any kind of diversity.

Back to Hatt: it's a little hilarious to see all of this stuff in such quick succession. The movie showing Glynn already long forgotten (at which point we have to guess Hatt was 60 years old or more) came out just a few years ago. If Hatt is already 60 in 1923, by 1963 he is a hundred years old. The Island of Sodor is a magical place, but is it that magical?

December 05
New MacBook Pro

I'm prewriting several posts just before the start of November, for purposes of NaNoWriMo. Fortunately, a fair amount of interesting tech news has just come out. As a follow-up to my post about the Surface Studio, I'm going to write about Macs now!
Further: this post was supposed to go up a few weeks ago, but I forgot, so here it is now!

Apple announced its first new MacBook Pros in several years. Some of the "outgoing" models were three years old, having been originally announced in mid-2013 and only refreshed with a newer GPU in about 2015. Most of those models are, for whatever reason, still on sale.

The basic facts are that the 11-inch MacBook Air, as well as the MD101LL/A, are discontinued. The discontinuation of the MD101LL/A means that the Mac Plus still holds the title of longest-running Mac model. At a bit over five years, I'm not sure even the Mac Pro will be able to beat that record.

Then, there are the new machines. There are three new MacBook Pro models: a 13-inch MacBook Pro with two ports and no touch strip, and 13- and 15-inch machines, each with a touch strip and four ports. The ports on all of these models are USB Type C connectors that also carry Thunderbolt 3.

The systems have faster solid-state disks, Skylake processors, newer graphics hardware, and an updated keyboard, and two of the three models have the touch strip. The machines also all have DCI-P3-capable displays, for better color when working with photos shot on the newest iPhones and iPads, professional cameras, and professional video cameras, and for better fidelity when creating media or doing things such as color correction. The memory capacities and types stayed the same, mostly so Apple could keep using low-power memory. The systems are also a little thinner.

In some of my circles, the reaction to the discontinuation of the MD101LL/A, the fact that the new systems ditch several "legacy" ports (except, oddly, the headphone jack), and the fact that most (if not all) of the systems have everything soldered to their boards has been swift and furious. Most are now convinced that "Apple doesn't care about the Mac" and that Apple's target market is obviously hipsters and (paraphrased) dumb millennials.

This sort of angers me for some reasons and just confuses me for others. The first and perhaps most important thing to me is that I consider it insanely condescending that people watched the demo of the touch strip and the biggest thing they got out of it was that you can use it to browse and select emoji. I think there's a real but unfortunate group of people who think that "normal people" shouldn't have access to computing at all. It's basically the same crowd that is angry that the majority of computer users no longer know how to program; the different but equally reactive crowd that thinks it's a shame computers are being used by people who aren't running content creation and scientific or technical computing software all the time; and the (yet again) different but just as loud crowd that thinks accessible, pocketable computers that are always on a network and can be used for communications (smartphones) are ruining society.

To be honest, I'm just about completely done with listening to people rail against modern computers because their Aesthetic Preferences™ suggest that this shouldn't be what computing is, or because of what having affordable computers supposedly does to society. At the same time, this crowd often complains that computers are too expensive anyway. Ironically, they might get what they want.

The other crowd has what I'd call legitimate technical concerns about the machines or the platform. For example, with everything soldered, there are no aftermarket upgrades and no replacements of failed components. The switch to USB Type C is an overall positive from the perspective of being able to charge the device in emergency situations, and of broader charger choice. Right now, Apple has the only above-60-watt Type C power delivery adapter on the market, but that should change relatively soon, and even in an emergency, a Type A to Type C power cord or a lower-wattage Type C adapter could mean the difference between productivity and being dead in the water.

Consider the following would-be apocalypse scenario for a Mac user who travels for whatever reason to a place like Las Vegas, NM. The cord on their MagSafe adapter fails. It is either a safety problem to use it or it simply won't charge at all. This is a pretty small town. There's no Best Buy, no Staples, their college doesn't have an Apple Campus store, and the people they are visiting all use Windows PCs. Their MacBook Pro will be dead in a few hours, there's nothing they can do to stop it, and the closest towns with Macintosh hardware on sale are hours away.

For the user of the 2015/2016 MacBook, or a new 2016 MacBook Pro, you can at least hop into the local cell phone store or Walmart and pick up something like Google's 18-watt adapter for the Pixel phone, which should provide a reasonably useful charge overnight and keep the machine topped up enough throughout the day for a pretty normal usage pattern. We don't yet know whether there's any provision for running the machine from more than one charger, but if there is, you'll be able to run and charge the machine more usefully. Targus also sells a 45W Type C laptop charger.
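
To put rough numbers on that: the 50 Wh battery figure and 90% charging efficiency below are my assumptions, not Apple's specifications, but they're in the right neighborhood for a 13-inch laptop.

    # Rough charge-time arithmetic for the dead-charger scenario above.
    # Battery capacity and efficiency are assumptions, not Apple specs.
    def hours_to_charge(battery_wh, charger_w, efficiency=0.9):
        """Hours to fill an empty battery, ignoring the machine's own draw."""
        return battery_wh / (charger_w * efficiency)

    print(round(hours_to_charge(50, 18), 1))  # ~3.1 hours on a phone-class 18 W adapter
    print(round(hours_to_charge(50, 45), 1))  # ~1.2 hours on a 45 W laptop charger

Either way, an overnight top-up is comfortably achievable, which is the whole point of the scenario.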

For the user of an older MacBook Pro, the solution would be to do without the computer, drive a few hours, or buy a cheap Windows or Chromebook computer at the local Walmart, hoping that your data is on some sort of storage media outside the computer. The most common Dell and HP laptops sold over the past ten to twelve years all use one of the same two power connectors per brand, and those adapters are usually available at Walmart.

The other thing we've seen with USB Type C has been the ability to power bigger peripherals directly off the bus. Things like the Seagate Innov8 let you run a big external hard disk off of bus power. On the desktop side of things, I'm hoping this means big powered Type C hubs, or future desktops with several Type C ports, allowing several of these disks to run without external power. It could be valuable for a laptop to be able to do that with a bigger disk, to reduce the amount of stuff a person needs to carry along with the machine.

The fact that Apple is now on Thunderbolt 3 may provide some other interesting benefits, in terms of things like being able to run external GPUs (or fast storage arrays) on Macs. Although it could be dangerous, politically(1), for Apple to support this, I really hope they do on Macs that get TB3. This could be a big boon, especially to iMacs and MacBook Pros, for docking as well as acceleration. There are a fair number of Mac applications at the moment that support OpenCL, and it's possible more could be built. There's no reason that a neatly packaged Thunderbolt 3 accelerator like a Xeon Phi(2) couldn't be built, if Intel thinks it can use that to better effect for OpenCL acceleration.
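
To illustrate why this matters to applications: OpenCL treats every capable device as a peer, so an external GPU on Thunderbolt 3 would simply show up alongside the built-in ones. A minimal sketch, using the pyopencl bindings (my choice for the example, not anything Apple or Intel ships):

    # Enumerate every OpenCL platform and device visible to the system.
    # An eGPU attached over Thunderbolt 3 would appear like any other device.
    import pyopencl as cl

    for platform in cl.get_platforms():
        for device in platform.get_devices():
            print("%s: %s (%d MB, %d compute units)" % (
                platform.name,
                device.name,
                device.global_mem_size // (1024 * 1024),
                device.max_compute_units,
            ))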

I think the biggest legitimate complaints about the machines so far are the removal of the escape key and the fact that the new models are appreciably more expensive than the machines they replace. (There is also a new hole at the $899 tier, left by the old 11.6-inch MacBook Air.) In addition, in true Apple fashion, its first-party power adapters and cables are not only more expensive than the previous generation of MagSafe adapters, but are, as with the 29W adapter for the MacBook, packaged separately. You can save money if you only need to replace the Type C charge cord, but it costs a lot more than before to buy a new charger to, say, leave at work, or semi-permanently attach to a particularly hard-to-reach outlet. There's also the concern for people who regularly use a lot of USB, FireWire, and Thunderbolt or Mini DisplayPort peripherals. This isn't new to Apple, but there will be much consternation while people spend a fair amount of time accumulating enough new cables and adapters to feel comfortable.

One thing I wish Apple had done with its power adapters, something Intel did with the power adapter for the Core m3/m5 version of the Compute Stick, is add USB 3.0 Type A ports to the charger. This could have helped assuage certain doubts about USB Type C and also function as a rudimentary dock. Plug your printer and an external hard disk into the power brick, and you get printing and a one-cable connection to a few important things. Perhaps even sleep-mode Time Machine backups when you're plugged into that particular brick. With three ports in some combination, you could get an external hard disk, a printer, and something like an Ethernet adapter. I'm waiting, third parties.

  1. I think the biggest trouble is that this would allow a software vendor such as Adobe to sidestep the need to port GPGPU functions from CUDA to OpenCL. When the current Mac Pro was released and then reviewed in 2013 and 2014, Thunderbolt 2 was not officially supported for external GPUs, so Adobe wasn't allowed to tell users complaining about OpenCL to go buy an nVidia GPU, put it in a box, and plug it into their Mac to get CUDA.
  2. It's no big secret that I love the idea of cabled outboard accelerators for media and technical-scientific acceleration. They're a good idea for laptops and mini-desktops, but I've got an appreciation for the idea when used in conjunction with all-in-one desktops and compact workstations like the Mac Pro, as well.
November 14
Surface Studio

I'm prewriting several posts just before the start of November, for purposes of NaNoWriMo. Fortunately, a fair amount of interesting tech news has just come out.

Microsoft has made some announcements of late. The show on October 26th talked about a few different things. The general overview is this:

  • Overview
  • Paint, 3D creation in "Windows Creators Update"
  • Surface Book
  • Surface Studio

First up is Windows and Paint: it looks like the next update of Windows is going to have a "creator" theme, referring to it as the Creators Update. It'll be interesting to see if this is how Microsoft conducts each of these updates going forward: show off a new first-party UWP app, make that app the theme of the entire update, and perhaps even show off new first-party or partner hardware pursuant to that particular goal.

The new Surface Book is the Surface Book that makes me want a Surface Book. It has a faster graphics processor and better battery life, plus the display is at least as good as it has always been, and it closes flatter, making it more traditionally "laptop" while still having the really unique hinge mechanism and the ability to remove the display.

The real star of the show was almost certainly the Surface Studio. It's already generated some fairly interesting (and admittedly quite intense) discussion in my circles online. The jury appears to be out on it.

Personally, I'm in love with the thing, and this hardware near perfectly fits what I think my actual computing needs are these days, in terms of horsepower and capabilities, in a form factor that's better than what I could have asked for if I'd written it down somewhere.

The Surface Studio is both simple and complicated to explain. At its core, it's a reclining all-in-one computer with a very high-resolution 28-inch display. It's got a quad-core laptop CPU, a beefy graphics card, and some RAM and storage. The port selection is useful, although perhaps just a little bit bare.

Microsoft describes it as a "Powerful workstation designed for the creative process". That's not too far from what they deliver, in terms of computing. And although people who watch the computing space will say over and over again that all-in-ones are unpopular, I think what's really meant is that all-in-ones sell poorly outside of the home and certain creative and educational environments, in part because many environments prioritize providing computing in a certain way. (Basically, the larger the environment, the more likely the computers themselves will be generic slim desktop or minitower systems, probably with slots, that can be bought en masse and then, if necessary, configured for each user.)

So, I think Microsoft's target market is basically the content creation professionals Apple is said to be throwing under the bus with the Mac. This is basically going to be the same appeal as the UNIX workstations and servers of yore. As of today, Microsoft's Surface lineup includes the Pro 4, the Book (2?), and the Studio, but Windows itself runs on a big variety of computers, from $99 tablets and ARM-based cell phones all the way up to much more powerful computers with more expansion than the Surface Studio or Apple's iMac and Mac Pro.

Basically: anybody in one of Apple's introduction videos for the Power Macintosh G4 or G5, whose media were print, web, photography, and video, as well as science and technical design, and heavy compute users such as people running searches and transforms on genomes. Compare that to the group talked about in the introduction of the Mac Pro: all content creators, but almost entirely media people. Basically, the new Mac Pro is targeted directly at 3D and 4K video professionals.

I suspect that in addition to these particular content creation professionals, there will be a fairly heavy contingent of the people I've been talking about in certain contexts for a while: Windows users who are tired of the PC OEMs, who don't necessarily want to build their own systems, who want Apple's hardware quality and support. Basically, I think there's a relatively strong contingent of people who have been clamoring for a Microsoft iMac for a while. Some of them may have preferred a tower, but I think the Surface Studio (and to a lesser extent, the refreshed Surface Book) will give them what they want.

The trouble is, at $3,000 for a base model with an i5 CPU, 8GB of RAM, 1TB of storage, and a 2GB GPU, it's a steep buy if you aren't really using the display and the pen actively, especially given that none of the models gets you a pure solid-state storage configuration.

The system is more or less well equipped, except for a few odd omissions: there's no Thunderbolt 3 for the addition of even more graphics horsepower or faster storage, no USB 3.1 or Type C connectors of any kind, and, if they're selling it as a workstation, an option for mobile Xeon CPUs and ECC memory would have been nice. The graphics are also of the older Maxwell generation, rather than the newest Pascal generation.

If Microsoft has its target market picked correctly and its marketing message is working, I don't really think the people they're trying to get onto this system will care. I hate to say it, but one of the things said most often of Mac people is that they just want to do work. Most content creation software actually works well on ultrabook-class hardware; much of it is about as fast as you'd ever expect that kind of application to be. The Surface Studio can outperform that class of hardware in spades, and it has several key pieces of hardware besides.

Firstly, nVidia graphics, even of the outgoing generation, is critical to getting Adobe Creative Cloud users onto the new system. One of the biggest points of contention with Mac OS X of late has been that Apple's rigid insistence on using AMD graphics cards in its systems, along with Adobe's rigid insistence on not porting anything to OpenCL, means that Mac users on modern hardware get almost no hardware acceleration for certain real-time tasks or for rendering, particularly in Premiere and After Effects. The short version is that the combined horsepower of two FirePro cards means nothing more to most of Adobe's software than the HD 4000 graphics on the old MacBook Pro. (Or, if you prefer, a Dell OptiPlex 7010 or 9010 from several years ago.)

Apple's own Final Cut Pro relies heavily on background rendering as well as OpenCL, so it runs very well on Apple's hardware, while Premiere runs about as well as it did on a midrange business PC from four years ago. It's not particularly inspiring, especially if you rely on Adobe's creative software in your day-to-day.

It's said that Premiere Pro and the other Creative Cloud applications have improved their support for OpenCL, but I haven't been able to verify that. There's also the trouble that Creative Cloud itself doesn't get extensively tested at every version like it did before, because there isn't necessarily a good way to define what a version of the software is, and there's no good way to get onto a baseline version that has been reviewed. If somebody wanted to put in the effort, though, I'm sure it would be appreciated. The other problem is that the long-running computer models against which the Surface Studio will be competing, such as the Mac Pro and iMac, haven't themselves been reviewed in a while. It's a shame, because Mac OS X has been updated (three or four times) and most, if not all, reasonably mainstream professional content creation software has been updated. The environment is different, and some of that might be material to people comparing the Studio against competitors from Apple.

The second key piece of hardware in the Surface Studio is the quad-core CPU. It doesn't really preempt a Skylake-based iMac, but it should do well compared to most other Skylake laptops. It's got the highest-end laptop chip Intel currently makes: the one most PC OEMs skip over when building big notebooks, in favor of either a mobile Xeon chip for workstations or a desktop-class i7 for gaming systems.

The Surface Studio also supports up to 32 gigs of RAM. That's not exactly impressive compared to the 64 gigs offered by the Mac Pro and the Skylake iMac, but it will be appropriate for somebody coming from a laptop or an older system; I'm thinking of people using Sandy and Ivy Bridge systems just nearing the end of their first corporate or institutional lives.

The third and perhaps most crucial piece of hardware is really a few things together. The Surface Studio has a 4500-by-3000-pixel display. Microsoft puts a lot of names on it, but it covers the DCI-P3 color gamut, which is rapidly becoming important for people doing content creation and media work. The display reclines from an upright position to a relatively flat one, said specifically to evoke a drafting table. The 3:2 aspect ratio is said, especially at that size, to be more immersive, and Microsoft spent a fair amount of time talking up the display's ability to show proof-perfect 8.5-by-11-inch pages.
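
The pixel density is what makes that page demonstration work. A quick check of the arithmetic (the 28-inch diagonal is Microsoft's spec; the rest follows from the resolution):

    # Pixel density arithmetic for the Surface Studio's 4500x3000 panel.
    import math

    width_px, height_px, diagonal_in = 4500, 3000, 28

    diagonal_px = math.hypot(width_px, height_px)  # ~5408 pixels corner to corner
    print(round(diagonal_px / diagonal_in))        # ~193, close to the quoted 192 PPI

192 PPI is exactly twice the classic 96 DPI Windows baseline, which is presumably how the display can show a page at its true physical size with clean 2x scaling.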

To go with the display, Microsoft bundles the same new Surface Pen (with interchangeable nibs) that came out last year. Adding to this is a new component, the Surface Dial, which works with the Surface Studio, Book, and Pro 4 as both a control knob and a button, in the style of the PowerMate, an accessory built a few years ago to help scroll through video timelines, among other things. Used that way, it's compatible with any Windows computer that has Bluetooth. The Dial adds an interesting new capability for modern Surfaces: you can place it on the screen to make radial menus (such as those in the old Windows 8.1 version of OneNote) appear, and it can be used to navigate them with the non-dominant hand while the dominant hand manipulates things. The specific use cases Microsoft outlines involve photography and illustration: changing colors and brushes on the fly, without having to use the pen, a mouse, or your fingers to navigate a menu structure or toolbars. It's also shown used in 3D work, and I can see the use for it in video, color grading, effects processing, and audio, but not necessarily in, say, page layout or web design.

The Surface Studio's closest competitor is the Wacom Cintiq 27QHD Touch, which for the money has fewer pixels than the Surface Studio, does not get you the Dial specifically, and still requires that you buy a whole computer. The 27QHD is $

Microsoft bundles the pen, and a matching wireless keyboard and mouse with the system, and also points out the specific power cord they include, which is designed to stay plugged in while the system is being shuffled around a table.

I think there is legitimate criticism of this particular system. The first is definitely the price: it will probably make the Studio unattractive to most people looking for a normal all-in-one desktop. The second is that the guts, while good, could hypothetically have been better. Microsoft elected to use a mobile processor instead of a desktop one, and a mobile graphics processor instead of a desktop one; add to that, there is a newer generation of nVidia graphics chip available. We don't have official word from Microsoft as to why they made that choice. The only explanation I can think of is that the Maxwell-based GeForce 980M is better at CUDA processing for creative and scientific applications than the newer 1080M, which is known to be a far better chip for gaming and VR.

The other criticism, to repeat a point I made above, is definitely that the machine would have benefitted in both the short and long terms from one or more USB 3.1 Gen 2 Type C or Thunderbolt 3 ports: faster networking, faster access to big storage arrays, and future upgrades to the GPU (or a second GPU) for improved (or just more) GPGPU compute capacity. Otherwise, I actually think the port selection is "fine," though I'm not particularly inspired by it, especially because we now have solid-state media that can far exceed the speed of 6-gigabit SATA, let alone 5-gigabit USB. Creative professionals are probably among the people fast storage impacts most.

The other big question, more from the perspective of technical users than

Otherwise, though, I think that the system will do well enough. In a lot of cases, we're talking about people who use a computer and monitor until it wears out, until something appreciably better is available, or until the display can no longer be calibrated properly, which, back in the days of CRTs and CCFL backlighting, was about every three or four years.

Despite the weird timing (deploying Skylake and Maxwell on a new system even though newer generations are available or should be available soon), I think it's a good system that will meet the needs of its target market. I think the choice of a mobile CPU, although one of the biggest places where there's room for improvement, is as good a proof as any that CPUs themselves have become less important over the years.

I want a Surface Studio badly. I don't think I'm strictly the target market, and I think there are better systems for my needs (or perhaps cheaper systems that are good enough), but it's just such a beautiful computer that I can't help but imagine one on my desk. I don't actually think it will encourage me to work on my Lightroom library any more than I do now, but I think it would be that much more beautiful when I do open those files.
