I'm taking a moment away from writing for Camp NaNoWriMo 2017 to talk about the experience of writing this year's novel. It's special primarily because I'm doing it in MacWrite Pro on my Macintosh Quadra 840av and other vintage Macintosh computers.
I decided to write it on the system in part because, on the literal eve of the event, I had gotten my Macintosh Quadra 840av set up in my primary office area with an Apple Multiple Scan 20 display. The thing to know about this setup is that it's beautiful, and if you haven't used System 7 or Mac OS 8 at those kinds of resolutions, you're missing out. It's by no means absolutely necessary to use them, but a different world of productivity opens up above 640x480 that I don't see very often. I've used Mac OS 9 at higher resolutions, but mostly in the context of my early days online.
So I start noodling on some ideas and I come across an old thing I'd like to rewrite and expand upon, and I decide to give it a go. For the first day or so, it was just a single MacWrite Pro file on the 840av, but it quickly grew to include a SimpleText file for notes about the plot, and a Resolve file for word count accounting purposes.
It has been particularly interesting to try all of this out. I have "used" MacWrite Pro and Resolve before, but never for actual work. It's been super instructive and to be honest a little hilarious to look at the juxtaposition of what "work" means and what was productive in the early 1990s compared with what we think of as being needed today.
For reference, I have published two successive articles complaining that my Microsoft Surface RT, a computer with a quad-core CPU, two gigabytes of RAM, a beautiful little display, wireless networking, a contemporarily useful version of Microsoft Office, and very long battery life, is too slow for me to use as an actual computer, and so I need a new laptop. Doing a real writing project on vintage Mac hardware has, as such, been somewhat humbling. MacWrite Pro's default RAM allocation on both 68k and PowerPC processors is one megabyte, and it will run, probably happily, in as little as 640 kilobytes if it must. Claris Resolve's minimum is 750 kilobytes and its preferred allocation is the same 1024. SimpleText's minimum is 192 kilobytes and its preferred is 512.
So here I am working with a document that's about to hit 10,000 words (I'm behind, whoops) on a computer with 24 megabytes of memory. The OS is using a bit under 7, MacWrite Pro and Resolve each have 1 megabyte allocated, and of all things, the SSH client I have on here is using 6 megs.
The actual workflow is interesting. This is probably going to sound familiar, because a lot of people still do this particular thing, but with USB flash drives. I have saved my primary copies of all the documents on a floppy diskette which I've named specifically for the project. Every day or so, I make a copy onto a hard disk of one of the machines as a backup. It's very low-tech, but it easily enables me to do this work on any of the computers I set up, even if they aren't networked. And, I have a few different systems I'm using for the work so far. The first is of course the 840av. Then, I have a PowerBook 1400c/166. In my bedroom, I've put a 6100/66 with a 14-inch Macintosh Color Display, and at work, I have a PowerBook 180.
I also have a Macintosh LC520 I could set up somewhere, which would probably be the truest test of this workflow. That system has a generous five megabytes of memory installed. It's also running a much older version of the OS than the other machines, though it should still happily run Resolve and MacWrite Pro. The only real limitation is that I don't have enough keyboard cables and mice to set up that system.
The obvious weakness of this system (all of these systems, har har) is that I do not have good backups in place to guard against failure. I have a large data cartridge drive on the 840av, but not enough media to trust it, or to use it if the 840av itself were to get full. I have networking, but I don't have an appropriate file server set up at the moment.
The other concern is in workflow. I don't think I could write my normal daily blogs on this system, because transferring data is a big challenge. Regular NaNoWriMo is also a much higher pressure event and there is a lot more unpredictable travel involved. The PowerBooks 180 and 1400 are portable after a fashion, but neither of them gets good battery life, and although it's not strictly a problem right now, they are both over twenty years old (21 and 24) and certain bits of plastic have weakened over time.
The other thing I find important to mention is that a lot of people talk about old computers in terms of their ability to create a distraction-free writing environment. Unfortunately, those people then go and load a bunch of old games and network communication software onto the machines. I've long known I would never be any more productive on, say, one of my 68k Macs than I am on one of my normal modern computers. I'm doing this not because I think it will somehow enhance my output or my capacity to write more things; I think that's dictated more by being able to move away from mainstream spaces. The offset created by a 20-inch CRT does actually help, but I can just put a laptop nearby or pull the LCD on my Mac mini forward, and suddenly I once again have two computers to deal with. I'm personally convinced that the key to my own productivity revolves more around having only one computer on hand, and having that computer be comfortable to use with tools I like on it, than around trying to force specific limits based on what is or isn't installed on the machine. When it comes to my blogging workflow, things that could otherwise force "distraction free" are a huge impediment to productivity. For example, the real problem with the Surface RT is that browsing the web on it is slow. I could do it and gather links, but it would be faster to use some other computer.
Once I'm done with the month, I'll probably re-evaluate the Surface RT. It feels slow in relation to other modern computers, but its advantages in comparison to old Macs are numerous and noteworthy. The disadvantages it has can either be worked around or just offset.
One of the most challenging aspects of my recent hunt for a new portable computer is that I've yet to figure out how to decide which machine to buy. Perhaps even worse, I don't even know what type of computer to buy. I've got a few different ideas for the machine itself, but as with previous computers I've bought, I want it to go through a sort of vetting process.
It's hard to decide between whether to buy based on a set of tasks I'd like to accomplish, or based on a set of physical properties I would like the machine to have.
Back in 2009, I set up a wiki page called "Computer Upgrade 3" and laid a few machines I liked out in a table to see what properties there were to compare the machines on. I ended up comparing available display technologies, processors, graphics processors, certain expansion/flexibility criteria, and a few other things. Perhaps the easiest part of that process was that I had a relatively clear idea of what I wanted, even if I didn't write the priorities directly on the page. I was looking for my only computer, and I wanted something with a reasonable combination of durability, being smaller than my old laptop, and ideally being faster than what I had before.
Today, the machines I might look at are so different that I feel there may need to be some other evaluation before I look at specific machines. Plus, changes in the overall environment (such as whether I can trust Lenovo to build a reasonable machine) mean that I'm looking in some places I wasn't before. I can pick out a machine and run benchmarks on it in a store, or read reviews, but it's difficult to put different machines in context and evaluate them when I don't know exactly what I'm searching for.
The trouble, and perhaps I should just start laying down comparisons, is that I'm really tempted to just put something like a 3:2 display at the top of the requirements list, which forces a very specific set of machines to float to the top.
On the other hand, there's the question of how necessary that display is. I can obviously complete work without it, even photographic stuff, but it would be helpful to know what stuff I'm going to be doing with the machine. The needs on a machine whose life will involve Office and a web browser and not much else are different from something I expect to play games on or use for photo and video processing.
Although I love the idea of being able to whip out my powerful laptop and render video wherever I happen to be, high definition video is nearly trivial to edit and render these days; I bet the Surface 3 could do it if I bothered installing the attendant software. The real trouble is that those capabilities make a machine bigger and shorten its battery life. I also don't strictly need that functionality: I have powerful desktop computers at home on which I can easily edit video, run virtual machines, do programming tasks, and so on.
This makes me lean toward getting something optimized for Office use on the go. If it can run remote desktop, my SSH client (for chatting), Office, and a web browser, then it should have no problem with any of my on-the-go needs. My actual needs are minimal enough that the Surface 3 generally handles them very well, the only real problem being of course that the Surface 3 appears to be on its last legs.
Almost certainly the thing making the process more difficult is this notion that my laptop should somehow be my main computer, and that it should also be purchased with an eye toward keeping and using it a very long time – over five years.
My reality is almost certainly that barring some kind of physical problem with it, my next computer will be kept five years or longer (unless I elect to go for something insanely inexpensive, or to use an iPad) but if industry trends of the past few years hold, I don't think even relatively low end computers today will have trouble with Office and web apps in the next several years.
I bought my ThinkPad T400 eight years ago, to replace an iMac and an R61. Almost 4.5 years ago I bought my Surface RT, and I got my Surface 3 probably about two or two and a half years ago. This year has been a bad year for hardware. First, my T400's display broke when it fell to the ground. Second, my Surface 3 has sustained several falls such that there's a huge crack in the display, the touchscreen now doesn't work, and at around 20-30% battery capacity, it randomly just turns off.
The Surface RT (and, as a side project, the Yoga RT) works and gets good battery life, but the main reason I stopped using the RT is that it's just too slow. It barely loads SharePoint 2010 pages, and those haven't changed in years. I would be able to write with it, but I would not be able to do my typical blog writing workflow on it, which involves a lot of research. I also rely on a few pieces of software that need an x86 Windows computer, chiefly an SSH client; one exists for Windows RT, but it works very poorly.
At least one of my colleagues scoffed loudly when I mentioned I need a new laptop. I have a few newer laptops, particularly from the Sandy Bridge era, but those laptops aren't particularly good. At least one is a downright bad experience. The other is better (despite having extremely similar specifications), but it's a physically huge machine that doesn't fit in any bags I own. Neither of them gets particularly impressive battery life, either; I would need to dump some money into that, whether via an outboard power supply, a new internal battery, or both.
Given that it's tax season, and that a young, uninformed version of myself looking to make things easier changed my tax withholding rate, I'm about to get a relatively healthy cash infusion. It's enough to make a dent in my debt, buy a better backup system (including a USB 3.0 card) for TECT, or perhaps some combination of those things. The trouble is it's not enough to go quite as insane as I did in 2009. That particular level of insanity would involve a high-end Dell Latitude with a good five-year warranty, a huge solid state disk, and a lot of memory, or something like a Surface Book.
It's tempting to question whether I really need a "laptop" at all and start looking at new iPads. At the core, an iPad should do everything I want: edit Word/Excel documents stored on my local SharePoint server, view web sites, read and reply to e-mail, listen to music and watch videos, and last a long time on battery, and the newest iPads should be able to switch quickly between these tasks. The real trouble is that the most obvious choice is the 12.9-inch iPad Pro, which, with a keyboard and a generous allocation of storage, costs more than a similarly equipped MacBook Air or Dell XPS, even more than a Surface Pro 4 in some cases, and will end up suffering many of the same problems a Surface Pro would. Plus, an iPad and a good keyboard (either Apple's own or the Razer one) will be at least as big as most good laptops, meaning the only real advantages are in iPad-unique software or in the simplicity and physical durability of an iPad. My iPad 3 was with my Surface 3 for most of its falls, and although there's a pretty big dent in the case, the display has not suffered.
Something I'd love to look at more is the new ThinkPad lineup. I realize ThinkPads are thoroughly over, but the X270, T470/p, and E/L470 look like pretty exciting machines: they bring Thunderbolt 3 to the table and have great battery life. There's also a certain nostalgia to the idea of getting a mid-sized ThinkPad again. The T400 was one of my best computers ever, and it is often a little easier to use something that has a little heft on the lap, certainly in comparison to Surfaces.
I think some of the real questions are in what is going to be done with a new laptop. Is it going to be a daily driver at home? Will it be docked to a keyboard, monitor, and mouse at home? Will it complement existing laptops that will continue hanging around as endpoints in different places I go, or will it be the computer I pull out at every opportunity?
I have some blog posts in the works, but I keep being unable to edit them. It's been difficult because I've been working on some medical stuff. It gets annoying quickly how difficult it is to work on medical things, even when they aren't your direct problem and you are instead working on them in service of your main problem.
On a whim, at the beginning of February, I went into my vascular surgeon's office to follow up on a failed access that we'll call #4. It was that surgeon's first attempt, and it was my first graft, after three failed fistulas.
His idea was to create a new graft, #5, slightly further up my arm. The surgery happened a week later, on my birthday, and was a success. Then, three days later at plasma exchange, it closed off.
So the literal next week, I was in having #6 created. That went well, but it also closed off after just a few days. That one was a more advanced type of graft, a HeRO graft, which combines a regular graft with a catheter so that outflow happens in a larger vein, where it's less likely to get clotted.
The closure of the HeRO graft was more traumatic than is strictly ideal. I think at this point a lot of the blood for that area is coming from my arm, and when the HeRO clotted, it and the other catheter came loose from the vein they're inserted into; if I moved into the wrong position, it felt like I couldn't move blood.
After getting another procedure to de-clot the HeRO once more, things got better. I received plasma with no trouble. I eventually followed up with the surgeon and they approved it for use in the next few weeks.
That next few weeks was this past weekend. We used it for the return line during plasma exchange, and the nurses and nephrologists who manage my plasma exchange treatments are pleased with the results. There are a few aspects to the graft I didn't expect, including that they run it close to the maximum speed of the machine, and that it bleeds for several minutes after the run is done.
In retrospect, these things shouldn't be surprising to me. You've poked a hole in the plastic tube carrying a large amount of blood very quickly, of course it's going to spew a little.
For a few weeks, I was on leave from work, and even since I returned, my energy for doing much of anything after work has been a lot lower. I'm sure it'll recover naturally over the next few weeks, but I was not up to writing while I was between surgeries and recovering.
I'm hoping to return to my normal posting schedule. I'll post this, then work on some things I have in my drafts folder, and then work on some new takes, probably on some of those same issues.
One of the biggest hardware, space, and workflow challenges I've had over the past several years has been how to focus on writing. I am not generally a believer in "distraction-free writing environments" especially when talking about using vintage computers that way. I am, however, a believer in other things that encourage good writing habits.
I'm a strong believer in the idea that there are things you can do to encourage you to write, but that you won't write unless you're ready. I've found space to be a good helper, but I've never figured out exactly what influences my productivity while writing.
I have never had trouble writing on any given computer. I don't need a computer to be set up in such a way as to be distraction free, nor do I need to do things like set up specific vintage computers to encourage writing. (Not to mention the perhaps somewhat obvious workflow limitations of using a vintage computer for writing, at least the writing I do.)
I have a theory that it's not the computer or even the place, but the way that place is set up. With a recent change to my living arrangement (short version: I have a housemate now) I'm forced to come up with creative places for things I use. For the first time since I've moved into the house, I have a computer in a formal location in my bedroom. Right now, it's just a little table with a chair at it, and the laptop floats back and forth between my recliner chair and the table, but I do find that writing at this little table is a bit more productive.
It's probably just because it's easier to force "distraction-free" if I only need to do it to one computer at a time. Maybe I'll see if I can use this setup to do that photo tagging I've been meaning to do for several years now. I wonder what it would look like if I set the desktop up in that space. I seem to have a lot more luck with writing and other productivity on laptop form factors. I don't know if it's just because I'm used to it, or if it's because my desktops almost always have a lot of other stuff nearby and it's hard to get into a particular flow when I have several computers competing for my attention.
It works well to have a bunch of computers at work, if only because the very nature of my work is that I am switching between tasks so fast that it doesn't make a difference if they're on one system or the next. The core of that work is being contacted for assistance. If I had a job where I had to focus on a single task or project, I'd probably reduce to a single machine, maybe even switch to an all-in-one and a single display.
I have a feeling that, with a little more reason to do it and a few more tables overall, I would be that person whose desktop is painstakingly clean and has one computer set up in such a way that the space still looks minimalistic; either the tower would be hiding elsewhere, or it would be an all-in-one computer.
And perhaps that's the secret I keep missing. I have had some luck at my tall desk which just has the fast PC at it, but maybe I should set up a similar sitting desk, or get a better chair for the tall desk.
One of the places I'm most productive is away from my home. I think there's a magical combination of only having one screen available, being away from certain resources such as your fast home Internet, your media library, and so on, and the feeling that your time spent in a public space such as a café or a restaurant must be spent well. I've also had good luck writing when I'm on battery, although I don't think that's a specific requirement.
It usually helps to have an idea of what I'm going to write, whether it's a vague idea, a draft I'm going to rewrite, or a specific outline.
I even find workflows where I need to do multi-tasking, such as bouncing ideas off an IRC channel, using a web browser for research and links, a text editor or OneNote for holding onto temporary bits of text, and then my word processor or other writing program for the actual text to work better when I'm on the go.
Much of this applies to tasks other than writing; the biggest problem is that writing is one of the easiest things to do on the go. Doing more intensive tasks on the go requires either that I can reliably work at home or that I have a bigger, faster mobile computer that holds more data and also reliably runs for a long time on its battery. More to come on that issue, though.
I recently participated in a discussion where somebody essentially presumed somebody else was "stupid" because they (as a UNIX system administrator) weren't aware of a long-deprecated feature in a version of Windows that came out fifteen years ago.
There was a pretty lengthy discussion about the fact that stupid and ill-informed are different things, and that (especially for a professional UNIX admin) not knowing about a feature that was deprecated ten years ago seems irrelevant. I won't harp on it too much here, but suffice it to say that people with ideas like these grate on my sensibilities because it tends to be a symptom of a larger pattern of some negative language that is often both inaccurate and in extremely poor spirits.
The technical component was interesting though. This person has an old server (using SCSI disks) with probably about 700 gigabytes of usable space. This is a common configuration for a server in the late Windows 2000 and early Windows 2003 era. The person has an external SCSI card and a tape library with some number of LTO-3 drives. Some amount of the tapes installed are dedicated to backup, and the remainder are dedicated to a cold storage tier. There are 22 terabytes of addressable cold tier space in this person's system.
For those unfamiliar with the idea, Windows Remote Storage Manager (2003) allows the extension of capacity and the better utilization of resources by placing infrequently used files on tapes, which have traditionally been cheaper to scale than hard disks. Today's version of this is basically to use a few solid state disks for the most active part of your dataset, and then start scaling down to different types of disks. You might use small 10k disks for the next tier. Then 4TB/7200RPM disks for the next tier, and then 8TB SMR 5400RPM disks for the last tier. Today, tapes basically don't exist in active tiered data storage situations, because they're just too slow, and because expectations are different.
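The tiering idea above is simple enough to sketch in a few lines. This is a toy illustration of age-based tier assignment, not how Remote Storage Manager or any real HSM product actually works; the tier names and age cutoffs here are made up for the example.

```python
from datetime import datetime, timedelta

# Hypothetical tiers, hottest first. The names and cutoffs are
# illustrative only, mirroring the SSD -> 10k -> 7200RPM -> SMR
# progression described above.
TIERS = [
    ("ssd", timedelta(days=30)),
    ("10k-sas", timedelta(days=180)),
    ("7200-sata", timedelta(days=365)),
]
COLD_TIER = "smr-archive"  # everything older lands here

def assign_tier(last_access, now=None):
    """Pick a storage tier based on how long ago a file was last read."""
    now = now or datetime.now()
    age = now - last_access
    for name, cutoff in TIERS:
        if age <= cutoff:
            return name
    return COLD_TIER
```

A real system would also migrate data back up on access and batch moves to avoid thrashing, but the core policy really is about this small.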
If you control the environment and you tell the accounting department that files they haven't touched in a year are going to take longer to load, then you know what to tell them when they call trying to open documents from two years ago. You can't exactly put the oldest bits of somebody's Dropbox account or last year's Facebook pictures on tapes. Even in internal institutional environments, the expectation now is that years-old data will be easily at hand. I don't think that expectation is unreasonable, given how much less expensive server hardware and bulk storage are today than they were even just a few years ago.
There were the normal lamentations about why Microsoft removed it from the system, but to me it seems pretty simple. Remote Storage Manager is a vestigial component of Windows that just didn't fit with the strategy or the needs of real customers, even in 2008. In 2008, SATA and relatively big disks and the ease of expanding SAS controllers meant that buying shelves of disks with monumental capacities was cheaper than dedicating another tape library (or a portion of your primary one) to a cold tier.
I'm guessing the big difference is that, for high availability reasons, Microsoft wanted to push customers to Distributed File System, which has existed for a long time, allows for much better high availability configurations, and makes more sense today than it would have in the '90s or between 2000 and 2003 anyway, because servers and disks use a lot less energy per terabyte than they did all those years ago.
As I mentioned, the person we were talking with has about 22TB worth of tapes, and 700GB worth of hot storage area. I don't know what their habits are generally like, but I know I need a lot more space than that to be active. I have a lot of data that should probably be sent to a robust-but-inexpensive cold storage location, but I'm not going to do that by adding a big tape library to my server.
It got me thinking… this person was very proud of their 22TB. I think they think they're in some kind of big leagues. The availability of inexpensive slow external and internal disks (things like 8TB SMR disks from Seagate) prompted a vehement reaction: you should never use USB storage, and important files and any large amount of data should always be on a server!
It's an interesting and ultimately unreasonable expectation. There are many datasets that need to be stored locally. Lightroom libraries, as one example, cannot be stored on a server, although you can use synchronization tools to keep that data on multiple desktops.
Back to servers, though… storing large amounts of data is less expensive than it has ever been. 8TB disks, SMR and PMR, are inexpensive enough that you could put five of them in a Drobo 5C and have 24 or 32TB of fault-tolerant storage on a server or desktop. You can put six 3.5-inch disks in most entry-level servers today, so you're talking about 32 or 40TB of fault-tolerant storage. (Side note: I'm considering a USB 3.0/3.1 card for TECT and a Drobo 5N or 5C as a backup system, perhaps as a sort of "disk to disk to other disk" setup, or as the tier you see before moving things to tapes or RDX cartridges manually.)
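The capacity arithmetic above is easy to sanity-check. Here's a rough sketch assuming the simple model where fault tolerance reserves one or two disks' worth of capacity (as in single- and dual-redundancy modes on a Drobo or a RAID 5/6 array); real arrays lose a bit more to formatting and filesystem overhead.

```python
def usable_tb(disk_tb, disks, redundancy=1):
    """Rough usable capacity of a fault-tolerant array: total raw
    space minus the capacity reserved for redundancy (one disk's
    worth for single-disk protection, two for dual). Ignores
    filesystem overhead and decimal-vs-binary terabytes."""
    if redundancy >= disks:
        raise ValueError("need more disks than redundancy")
    return disk_tb * (disks - redundancy)

# Five 8TB disks: 32TB single-redundant, 24TB dual-redundant.
# Six 8TB disks: 40TB or 32TB.
```

Those numbers match the 24/32 and 32/40TB figures in the paragraph above.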
I'm sure that in its day there was a lot of success and cost savings to be had by using a tape storage tier. Today, I just don't think it's reasonable. You can buy extremely dense disk enclosures from normal PC vendors these days, and for any enclosure attached to a controller that supports 4k sectors, you can easily swap out old disks for new ones to increase density and capacity on existing systems. You can often switch out disks one at a time and then grow into the new capacity using whatever software or RAID controller you're using.
In the light of the fact that it's possible to equip a single disk enclosure that will run over 600 terabytes of disks, I'm not sure it's necessary for storage tiering products using tape to exist. With disks using less energy than ever, density going up, disks costing less than tape below a few hundred terabytes, and with the ability to spin down disks, it doesn't make an awful lot of sense to bother with tapes. Especially since at "web scale" you need several copies of a piece of data anyway, meaning that tiered storage with tapes is going to need to work that much harder to work at the expected level.
I think it's neat that somebody has done it and documented it, but there are few enough situations where it's relevant that it probably didn't make sense to keep in the software. Using deduplication to reduce the overall load on disks in a server, and using true archiving solutions to offload old data, probably makes more sense than using tape storage in this context. I think tape archiving and tape backups are good ideas, but a tape-backed file server doesn't make as much sense.
The most interesting update in the world of vintage Macintosh is probably the new Tiny SCSI Emulator. It challenges the SCSI2SD for the performance crown among storage replacements for vintage Macs. It is also easier to hand-assemble, and it promises future functionality such as SCSI-based Ethernet and video output, functionality that real SCSI peripherals once provided.
Naturally, there's a discussion going on about this, but much of it appears to be based on how smug people can be. The original post was pretty bad even for Ye Olde Computer Forum, basically decrying the use of the SCSI2SD as unmanly in a world where you should, for some reason, be using old mechanical hard disks with SCSI-based computers. Never mind that it is essentially impossible to rebuild or meaningfully refresh mechanical hard disks, and that, with the oldest 68k Macs now well over 20 years old, finding machines with working disks is getting increasingly difficult.
A few years ago, a trend was to adapt fast, modern server hard disks for use in 68k Macs. The real problems with this are that the supply of those disks isn't unlimited, and that the Mac community isn't the only one searching for a modern disk or a better storage solution. Various UNIX computers use those disks too, and those systems are generally better able to cool and use disks like this (and, similarly, are very poorly served by traditionally low-performance solutions such as the SCSI2SD). Even pretty bad SCSI systems, like the late Netburst-based Xeon workstations and servers, can use a lot more disk performance than a SCSI2SD offers, at least as it stands today; it is likely to get better, and the SCSI2SD v6 is said to be much faster.
The other component of the discussion was essentially that there can be no vintage Mac hobby without physical repair skills. It's an unfortunate discussion to have to have. I know a lot about Macs, and I've got a lot of context on the history of the platform and the history of the community. Back in the day, when most of these computers were entering their second lives, my use for them was as my main computer. Today, as all of them pass 20 years old, my main use is pretty different: I'm running fewer "modern" Internet applications, using more pro applications, and doing more of the things I wanted to get into when I had my older Macs but couldn't for various reasons.
There are different reasons to use these computers, and not everybody is at the same skill level, either with Macs or with physical electronics tasks such as soldering or board-level diagnosis (figuring out which particular chip failed given particular symptoms). The problems with this attitude are that it's exceedingly exclusionary, and that it suggests people shouldn't be able to enter the hobby without skills they may not be able to gain without first joining the hobby. The other problem is that it excludes people for whom learning those skills doesn't make sense. I think there's a presumption that everybody will find soldering to be fun, because the people making the argument do. I don't want to exaggerate here, but I just don't think this is true. If the knowledge and rate of learning of the people who show up is any indication, I'm not expecting many of the literal children to pick up soldering very quickly; plus, they're literally children, so there's often the problem of being able to get supplies. At the other end of the spectrum are people who just don't have the time, plus other reasons why someone might not be able to do a bunch of soldering.
The trouble is that it's hard to deny that soldering is a valuable skill, and so because it's valuable, people extend that to mean that it's somehow mandatory. I'm sure everybody does this to a certain extent. It's a side-effect of the fact that we're all around for different reasons, and there is a correlation between people who place a high importance on soldering skills and people who happen to be good at it, either because they are more interested in electronics than computers or because they have enough vintage computing hobbies that they've entered one where it's inescapable.
But that doesn't help those of us with homes small enough we can't set up a soldering station, or who just never have the time or energy to do it, or who don't have the physical dexterity for it, and so on.
The SCSI2SD sort of ends up being a proxy for the conceit that everybody should have soldering skills. You can't hand-solder a SCSI2SD yourself, because it is assembled using methods that require special equipment and skills, and buying one requires admitting that the availability and complexity of traditional disks and adapters made it more worthwhile to replace them entirely with a device that accepts SD cards. I can see why it might not be ideal to admit that you've reached the end of a traditional solution. The other complaint about the SCSI2SD is that it's expensive. It costs more than $0, but it's not as expensive as many people think, and the price has been coming down steadily over the years.
The end of the year has been unkind to Apple, probably because Apple has done a lot of things this year that have been interpreted as slights by its user communities, even among technical crowds and the tech press, which are often generally supportive of what Apple does.
Apple's main accomplishments of the year are, in no particular order:
- New WatchOS and Watch hardware
- Making iPhone users angry by removing the headphone jack
- Introducing a true successor to the iPhone 5S and dramatically mis-estimating the demand
- Introducing the 9.7-inch iPad Pro hot on the tail of the 12.9-inch version
- Introducing questions as to whether the 12.9-inch iPad Pro had been delayed, and creating a situation where neither model is a clear leader: the 12.9-inch iPad Pro has more RAM, but the 9.7-inch version has a far better display
- Leaving any discussion of updated Macs until the very end of the year, at which point it introduced new MacBook Pros with controversial new connectors
- Failing to say literally anything about the entire rest of the Mac lineup
Apple has generally been unkind to the Mac over the past several years. It has been worse this year. Some of this has been a relatively natural progression of Apple's relentless search for sleeker computers and better integration. Some of it has been questioned over the years of course. The best examples I can think of are the removal of the optical disk drive in the iMac and the new change to USB Type C ports.
I'm usually fine with Apple's changes, but a lot of them feel like they're being made somewhat capriciously by a company that has had a little too much success predicting the future. I've argued before that some of the changes Apple has pushed over the years are damaging to users. Floppy diskettes were fine, and I don't specifically mourn the loss of the optical drives, but it's starting to feel like Apple's doing things specifically to spite its users.
I think there are a few problems with that idea. I don't think Apple is truly contemptuous of its users, despite all appearances to the contrary. I think the problem is that Apple massively misjudged excitement for the new Touch Bar feature (as it did with Force Touch on both iOS and Macs), and in a presentation about a machine whose base price ranges from $1,799 to over $4,000, it seems tone-deaf to spend literally any time talking about the machine's ability to do predictive text input (on a keyboard where people can easily train themselves to type at over 100 words per minute) and easily select from different emoji.
USB Type C is absolutely the future and annoying though it may be, I don't think Apple is wrong to jump all in on it. I do think that the pricing of the machines is wrong, and that Apple was probably wrong to leave the entire 2015 MacBook Pro lineup in place at their existing prices. (Perhaps to drive the point home, Apple should have discontinued the 2015 MacBook Pros and dropped the price on the 2012 MacBook Pro, the MD101LL/A, that was still on sale until the keynote.)
A lot of these things aren't inherently bad, but drama surrounding the MacBook Pro, combined with continued poor messaging has caused additional drama. On top of all of this, Apple has been almost utterly silent about the rest of the entire Mac platform.
The silence has probably been the worst part. It is mostly well understood that Apple is, in a lot of ways, at the mercy of its suppliers for things like new generations of processors and graphics cards. This is especially relevant to the Mac Pro, which is using 2012's finest Ivy Bridge-EP processors and probably 2012 or 2013's finest AMD GPUs. Apple skipped Haswell and Broadwell for no discernible reason, so the hope is that now that AMD has Vega GPUs, Apple will update the Mac Pro to use those and the Skylake-EP CPUs due out a little later in the year.
Old and a little slow though it may be, the Mac Pro is still considered to be the most capable Mac. It holds the most RAM, has the most ports, and it has that dual-GPU configuration, if you can use that kind of thing. It's a very good computer, but it only truly effectively serves one of Apple's many constituencies.
Chuq Von Rospach suggests that the Mac Pro, along with a potentially re-balanced MacBook Pro that prioritizes performance and capacity over thinness and battery life, should be considered a strategic machine: the ultra-high-end hardware that makes video editors and developers of various kinds happy to be able to buy. The argument here is that these are the people who influence the opinions of others and help them choose Macs. Without those folks, there are fewer people running around specifically evangelizing the Mac.
Technical users (and even people who power-use specific apps, but may not power-use the whole system) are often implicitly tech support for others around them, and in addition to influencing opinions, they influence recommendations for new technology purchases because it's easier to support the thing they use.
I have personally been disillusioned with Apple for a long time, even though they are making extremely compelling hardware and their software (what little they still make, but Mac OS X specifically) is now better than it has been for several years. I sometimes find myself wondering exactly how many Mac purchases I've encouraged or discouraged over the years. I normally tell people that Windows computers are perfectly fine, and that if they already know Windows there's no specific need to get a Mac. Would I have said those things if I'd still been using Macs?
To make things worse, Apple now has messaging trouble with the iPad. The 9.7-inch iPad Pro shipped only a few months after the 12.9-inch model, with a tremendously upgraded display, but with only ("only") 2 gigs of RAM. There continues to be consternation about whether or not the iPad, any model, deserves the "pro" tag. This is mostly the same crowd saying that Apple should stop selling the MacBook Pro under the "Pro" label. I'm mixed. It's clear that "Pro" has meant "nicer" for a while now. Apple advertises the iPad Pro hardware as being capable of a lot of very intense computing tasks. 4k video editing and photography chief among them, especially with the newly improved displays and more memory.
The trouble is that Apple has done a poor job shepherding the iOS ecosystem into something that can truly do this work. Perhaps Apple has something up its sleeve, but they've been saying this for years, and I've been writing about it for years, and as far as I can tell, network connectivity will never really catch up with the demands of using iCloud/OneDrive/Google Drive storage for large data files.
An attentive version of Apple would realize that workflow and multi-tasking are important and build machines that cater to the needs of certain customers. It doesn't even need to be a big 2-socket, slots and disks workstation (although there are many who would buy that product, even if they did not strictly "need" it) – Apple just needs to say something about the Mac.
Almost more importantly than opinion-makers is the fact that a system like the Mac Pro is needed for the best experience when doing Mac and iOS development work. I don't know how Apple builds Mac OS X, but the logical choice would be to use some kind of Mac for the job. The Mac Pro makes the most sense, and if that's what Apple is doing, it would make sense that Apple has motivation to update the machine.
Granted, it could very well be that Apple is using servers from some other vendor to compile Mac OS X components, thus side-stepping the very issue that Mac users have, which is that there is nothing better than the Mac Pro for a lot of work, and building a non-Apple Mac "works" but is not at all a solution for people outside of Apple doing similar work, or different but just as demanding work.
The more I think about it, the more I think that there may be some room to re-balance what the Macintosh platform looks like. I think that there's room to re-position both the Mac mini and the Mac Pro upward. Apple could (maybe even "should") build a Mac mini that is a little bigger (perhaps even using desktop bits) and a new Mac Pro to sit above the current one with a little more configuration flexibility.
Another option may be to build both the Mac mini and the "Mac" on the same board platform, with the "Mac" coming in a taller enclosure that allows a better CPU and a discrete graphics card (basically 27-inch iMac specs) and replace the Mac Pro with a bigger, more flexible machine entirely.
The first step however is doing literally anything to acknowledge that the Mac is an important product, and announce some kind of plan to do literally anything with the veritable pile of 3-or-more year old products still being sold today. The MD101LL/A was a great joke and at a lower price point could have continued to make sense, but the 2013 Mac Pro is just sad.
I originally set out several weeks ago to write a post about my long-standing desire to be a video content creator. I was going to write about how creative I'd been as a child and how much video crap I did any time I had access to such a camera. Then, I realized I was talking about prioritization, although indirectly.
I made plans to shift the post and create a video blog to talk both about that post and to possibly talk about my desire to post regular videos of some sort to YouTube, whether they're tech talks or about health or just a general video log of day to day happenings, I wasn't sure yet.
After, perhaps ironically, failing to write anything at all (for the good reason that I caught another cold), it occurred to me that I should just rewrite the post (I had one and a half sentences done) around the idea that we all prioritize different things, and that different factors lead to those priorities.
The background I was going to explain and work through a little bit in the previous post was that as a child, I found the process of producing video content incredibly interesting. I didn't think I would go work in TV, but I felt like I had a lot of stories to tell, and I frequently ran up against the limitations of both technology, and what I had. A friend and I made several short videos on a computer I had using Windows Movie Maker, a webcam, and a USB extension cord, and we had the better part of a longer story with more varied settings recorded on a DV tape, but the tape would get rewritten and it would be years before I had access to both the camera and a computer I could capture and edit with.
Once I got that, I was doing some more things, although the limitations of storage meant I couldn't do much. Later, I had limited access to video equipment for a while. Most recently, I have ample video, storage, and compute resources, but what seems like fewer ideas and less time.
I've been wanting for a few years to get back to video stuff, but the thing I don't have today is my attention. I have a few videos on YouTube, and I take clips on my phone and my camera from time to time, but even though I've wanted to for years, I've never had the wherewithal to keep up a video production workflow or a particular series of things for a while.
I've tried a few different things – video games, server administration stuff, personal blogs, one train video I managed to capture, among other things – and none of it has worked.
I think what it comes down to is prioritization. Every time I start to work on video stuff, it does "work," but I'm not able to keep up with the other things I do, whether that's playing games, meeting up with people, writing, or just having mental downtime. Certain types of videos I'm both mentally and physically unable to prioritize. I couldn't go make a bunch of train videos, because those are quite physically intensive, and while I'm in a good place, I'm not in that good a place.
The other thing I worry about is my ability to create short fictional videos or periodic personal blog content. The world doesn't really need another "tech" channel or podcast, per se. I would want to find something unique, which might be easy to do in concept, but difficult to do in practice. Although it would be technically easy, I don't particularly want to be a talking head channel, certainly not on a technical issue.
But all these are excuses, and I think at the core of it, I know I could find something if I wanted to and put some time into it, and if I did, I could produce something somewhat reasonable, and get better at the craft. I could create a conservative posting schedule or even make things varied so I don't get bored.
I was originally going to make a video to pair with this post to make the announcement of regularly posted video content, but it never happened and to make it worse, this post was delayed by a week due to my usual winter troubles, minor illnesses made worse by the continued existence of my chronic illness.
We'll see if I end up coming up with anything. I was pleased with myself this past year for avoiding long and arduous hospital stays and with relatively few exceptions I made a blog post every week. It's down from where I was a few years ago when I was writing about music, but it has been good nevertheless.
Perhaps I will never get into video production, and my YouTube channel will remain as it has always been: a vessel for random things I see or do, for the times when something particularly good comes up. None of this will stop me from eventually spilling a few thousand words about the workflow challenges keeping it from happening, though.
Last week, I wrote about the Type C connector for USB, ThunderBolt, and other things being encouraging. The tech news media has been publishing article after article about how it's doom and gloom and about the fact that the market for peripherals is currently relatively nascent, and the standards are still evolving.
Near the end of that piece, I compared it to a previous big transition that Apple kick-started and forced: the removal of 3.5-inch floppy diskettes from computers. We've been discussing the issue in #68kMLA on irc.oshaberi.ne.jp on and off for a few weeks and I think it's worth writing something about it here.
Before their removal from computers, floppy diskette drives were essentially the baseline storage media. 3.5-inch, 1.44 megabyte drives had been common since the late 1980s on Macintosh as well as x86 "PC" platforms, and were available as upgrades to many other types of systems, although often at great expense. Early on, 3.5-inch diskettes were a huge improvement on the 5.25-inch media they replaced, and for years, they were considered enough. Until the mid-late 1990s, most (if not all) software (outside of CD-ROM specific multimedia titles) was available on them, and they were the most common way to transfer data or make backups of a computer.
I believe the removal of the floppy diskette (and in general, the market's failure to develop and implement a viable replacement before then) was much more damaging to computer users (financially and operationally) than the change to the USB Type C connector has been.
There are a lot of reasons for the failure to develop a new standard. I think a lot of it is basically a failure to recognize the floppy connector could or should be a generic thing. Serial connections on Macs were almost never used for storage, and although PC Parallel ports were used for various types of storage, it was nearly never particularly fast. Even if the controller could be adapted for use with newer drives, no newer drives were ever developed for Apple's floppy controller. (Notably, in the 1980s, there was a 20-megabyte external hard disk available for the port, but it only worked on certain early Macs.)
On the PC side of things, there was a similar lack of a universal, faster connector compelling for storage (HP-IB, aka IEEE-488, has been suggested, because it was fast-ish, cheap-ish, and allowed multiple devices to be connected). SCSI wasn't common on PCs, mainly because it was difficult to set up and because it was expensive.
Several vendors floated ideas for replacements: IBM was playing with 2.88 megabyte drives and diskettes on its own PS/2 and ThinkPad computers, Sony developed the MiniDisc and MD Data (140MB) format in part to be used as a data drive, and floptical technology allowed for 21 megabyte cartridges in a similar footprint to 3.5-inch diskettes. Later in the '90s, there were other formats such as SuperDisk/LS-120 and Zip (100, 250, and 750), but I don't have information on which of these were intended to supersede the floppy disk as a standard, and how many were available only as an expansion system, the way that EZ-135 and bigger systems such as Bernoulli, Jaz, Orb, and SyJet were.
For a superfloppy format to become standard, PC OEMs and Apple would have needed to bundle the drives with systems (at the very least), software would have needed to ship on the media, and the format would have needed to be, or quickly become, very inexpensive and easy to add to existing computers. Unfortunately, none of these systems ever met all of those criteria. Zip 100 probably came the closest. Unlike the Ditto, Iomega never built a version for the PC floppy connector, but parallel port and IDE/ATAPI versions were inexpensive options for PCs; internal and external versions for Macs were common; Iomega was among the first to release USB storage devices with the Zip 100; and by the end of the '90s, Apple was in the regular habit of making several of its computers available with internal Zip drives. PowerBook accessory manufacturers also built versions of the drive to be installed in Apple's laptops.
We had a Zip 100 drive attached to our PC at home, and when dad got tired of the parallel port version, he installed an ATAPI model. I used the parallel port version with whatever PC equipment I had around, and I had one disk. Later, I inherited a USB Zip 250 drive, and a Mac I had inherited got the old IDE/ATAPI Zip 100 drive, making it practical for me to use them. (Almost annoyingly, though, because by then I had an Ethernet router and three or four computers connecting to one another with file sharing anyway, plus a DVD burner on the PowerBook I had.)
Rewinding a little, though…
My own reaction to the removal of the floppy diskette drive was delayed, because I didn't have computers without them. Even with floppy drives available, though, my software ecosystem was relatively weak, and I didn't have very many diskettes or a good way to organize them, physically.
One of my strongest memories from this era was when I asked a friend of mine who had access to more Internet connectivity to download an MP3 file for me (1). The file was something like 3.4 megabytes, and needed to be split across 3 or 4 floppy diskettes. He used PKzip on a Windows computer to compress and split the file, and then handed me an envelope with the attendant diskettes in it. I had to hold on to those diskettes for like a month while I frantically searched for a program to open that data on a Mac.
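What my friend did with PKZIP's disk spanning was, in spirit, just chunking plus a rejoin step. A minimal sketch of the same idea in Python (the file names and the ~3.4 MB size are illustrative stand-ins from my memory of the story, not the actual 1998 file):

```python
# Split a file into floppy-sized chunks and rejoin it, the way PKZIP's
# disk spanning worked in spirit. File names and size are illustrative.
import os

FLOPPY_BYTES = 1440 * 1000  # nominal "1.44 MB" diskette capacity

def split_file(path, chunk_size=FLOPPY_BYTES):
    """Write path.000, path.001, ... and return the list of part names."""
    parts = []
    with open(path, "rb") as src:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            part = "%s.%03d" % (path, len(parts))
            with open(part, "wb") as dst:
                dst.write(chunk)
            parts.append(part)
    return parts

def join_files(parts, out_path):
    """Concatenate the parts back into one file."""
    with open(out_path, "wb") as dst:
        for part in parts:
            with open(part, "rb") as src:
                dst.write(src.read())

# A fake "MP3" roughly the size of the one in the story
with open("song.mp3", "wb") as f:
    f.write(os.urandom(3400000))

parts = split_file("song.mp3")
print(len(parts))  # 3,400,000 bytes fits on 3 diskette-sized parts
join_files(parts, "song_rejoined.mp3")
```

The real PKZIP also compressed the data before spanning it, which this sketch skips; the point is only that the split-and-rejoin step is trivial on the machine, and the hard part was finding software on the Mac side that understood the format.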
I eventually got the file open and could listen to the song. I didn't think about it in this way at the time, but the limitations of floppy diskettes had been holding me back for some time. Other ecosystems for large data transfers and backups had existed for a very long time, but most were very expensive, many were discontinued by the time I was using computers, and things like external SCSI hard disks, Zip100 mechanisms and even SyQuest and Bernoulli media were uncommon at garage sales and in used computer retailers at the time. I'm convinced most people were keeping these peripherals and moving them from system to system at the time.
I could use floppy diskettes to transfer data, but I didn't have the context or a good way to find out that the paid version of StuffIt was one of the best ways to put a large-ish amount of data on floppy diskettes. I moved some data from one machine to the next, but not a whole lot.
The situation got worse when I picked up a used iMac a few years later. It didn't have a diskette drive, and it wasn't in the cards for us to even buy one for it, let alone, say, a USB Zip 100 drive and one or more SCSI Zip 100 drives for my other Macs. The iMac had Ethernet, but none of my other Macs did. The iMac had a modem, but none of my other Macs did (until after I got the iMac, at which point I found a 56k external modem), and I never mastered the art of dialing from one computer to the next. I should have been able to do it with real phone lines, but I was hoping for something involving a direct connection, for performance, reliability, and timing reasons. (It would not have been a leisurely experience to try to transfer a gig of crap from my Power Mac 7300 or Quadra 840av to the iMac via phone lines, but if I could have done it directly, the speed would not have mattered, and I could have moved things around much more frequently.)
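For a sense of why the phone-line route was so unappealing, the arithmetic is easy to sketch (this assumes a perfect 56,000 bit/s link, which real modems never actually delivered):

```python
# Best-case time to move a gigabyte over a 56k modem
GIB = 1024 ** 3        # one gigabyte, in bytes
MODEM_BPS = 56000      # theoretical 56k line rate, bits per second

seconds = GIB * 8 / MODEM_BPS
hours = seconds / 3600
print(round(hours, 1))  # about 42.6 hours, before any real-world overhead
```

Real throughput would have been lower still, with protocol overhead and line noise, so a single transfer could easily have eaten most of a week of phone time.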
Even if I had been able to buy a Zip or SuperDisk drive for the iMac as well as for my other computers, I would not have been able to use that to transfer files to other people. Literally nobody I knew had a SuperDisk drive, and my Mac SuperDisk mechanisms would not work on their Windows computers. I knew one or two other people with a Zip drive, but it was nowhere near universal, and there wasn't one at school.
The first USB thumb drives weren't available until several years after the iMac was, and even though Apple wanted you to believe that the "i" in iMac stood for the Internet, the Internet wasn't really a practical transfer medium yet. A couple of years later (in 2000), Apple would address the issue with the iTools service, which included iDisk, an online storage space. It wouldn't become particularly practical until a little later, under the .Mac moniker. With .Mac, Apple provided some additional software (antivirus, extra GarageBand content, and Backup) as well as space for a web page, an e-mail account, and critically, an online storage space that presented itself as a mounted disk on your computer. You could use Backup to make backups to the iDisk (Apple's name for the storage space), an external hard disk, or writeable CD or DVD media.
.Mac only offered ten gigabytes of space. It would have been enough to back up writings, a few critical photos, a modest mail database, and perhaps your taxes or other financial data, but you would have been hard pressed to put a whole bunch of video up on it. It served the need though. Just the previous year, Apple released the "DV" version of the iMac G3 computers, which had the capability to capture and edit DV video.
The real trouble with this is that most people in the US were still using dial-up Internet at the time. I never had an iTools account, because I didn't have Internet connectivity at that moment, and I didn't bother using .Mac because it was costly, and I couldn't make very good use of it on my dial-up connection anyway.
Other online storage services wouldn't arrive until almost ten years later, when Microsoft SkyDrive and Dropbox showed up in 2007. Those services function differently, however, by syncing the contents of the storage space to your local computer. That approach copes better with slow Internet connections, which made things simpler, though perhaps not better.
Many people say that it was rewriteable CDs that ultimately replaced floppy diskettes. There is some truth to this, but I would argue not a whole lot. For starters, CD-RW equipment was uncommon and cost a lot more until much later, and you couldn't easily put just a few files on a CD. You needed to keep the amount of space your CD would use free on your main hard disk, because the process worked like this:
- Gather a bunch of files.
- Put them into a burning program like Toast or Nero, or onto a disk image on a Mac.
- Close all other tasks and applications.
- Burn the CD, make sure to test it.
Burning a CD (or later a DVD, which needed even more disk space, took more time, and cost more) wasn't just something you could do because you forgot to print your homework on the way out the door before school. It was a big project each time, something you did maybe once every few weeks to get old junk off your computer, not something you could do quickly. There were re-writeable DVDs and CDs, but they were incredibly inconvenient for this task. If you carried a Word document or a presentation to a collaborator's computer, they needed to copy that file (and any others) off the media, make their edits, and then completely re-burn the disc.
Flash drives came onto the scene in the early 2000s, but I don't think they were practical until a few years after their introduction. I came by some flash drives in the 64-256MB range in 2006-2007, by which point home networking had become commonplace, and in 2007, both SkyDrive and Dropbox launched for free, offering a few gigs of file syncing and sharing space.
In the end, there were (and are now) a lot of solutions, but most of them cost a lot, took a long time to materialize, and many of them presumed things that weren't true in 1998 and weren't even necessarily true in 2006. When I got a flash drive, it fell right in with what eventually materialized as my standard workflow. The flash drive was always a swing space, used for storing working copies or copies of only specific documents for transmission. My main document store was on the hard disk of my laptop, with monthly "junk archives" going to DVD-R, and backups of certain files periodically going to DVD-R as well.
Later, buying big external hard disks became more reasonable, and further along still, in addition to being able to count on destination computers having USB ports, you could usually rely on other locations to have an Internet connection. On top of all of that, online storage spaces have become larger and cheaper. For backups, various cloud services charge less than $10 a month, and some have gotten the cost down to below $5 monthly.
The challenge now is that we have an utter glut of good options for massive amounts of data storage. Pocketable, multi-terabyte hard disks cost approximately what just a few 100 megabyte cartridges did a few years ago. We're at the point where it's generally reasonable to presume that if you can buy a computer you can buy some kind of data storage for it.
It's hard to decide, if you have a computer to back up or some photos to preserve, whether you should spend the money on a cheap subscription service or on an external hard disk every year or so.
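Running the numbers shows why the choice is genuinely hard. A quick sketch, using made-up but plausible prices (the $5/month and $60/disk figures are my assumptions, not quotes from any vendor):

```python
# Cheap backup subscription vs. buying an external disk every year
SUB_PER_MONTH = 5   # assumed cloud backup price, USD per month
DISK_PRICE = 60     # assumed external hard disk price, USD
YEARS = 3

subscription_total = SUB_PER_MONTH * 12 * YEARS
disk_total = DISK_PRICE * YEARS
print(subscription_total, disk_total)  # 180 180: a wash on these numbers
```

On numbers like these the two come out nearly even over a few years, so the decision ends up hinging on other factors: whether you trust a remote service, whether your upstream bandwidth can keep up, and whether you'll actually remember to plug in the disk.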
To use these cheap external storage devices with a new Mac, you need an exceptionally inexpensive adapter, which you buy a single time (perhaps twice, if your computer has more ports) and then use for as long as you have any devices with that type of connector.
It will be inconvenient for a few years, and then new peripherals will come along and be purchased to use the new connector. The new connector will presumably last a long time, long enough for the vision of the standards to be finished and bad cables or improperly implemented ones to be weeded out. In all, I am confident it'll be a lot less damaging than the transition away from floppy diskettes ever was. There will, for starters, be a cheap, easy way to connect existing hardware to new machines. There are already peripherals with the appropriate connector, and as far as I have seen, they do not cost more than existing peripherals.
It will be a lot less bad than the removal of the floppy diskette drive.
- (1) It was "Changes" by 2Pac. I had been listening to a MIDI rendition of it for a while, and we were both surprised when we heard the real song.