February 19
Antsle and home-labbing and self-hosting

Meta: As per always, I wrote some of this a few months ago, but it got delayed for motivation reasons. Some of the details here are based on configurations that were available around August of last year, and it appears some new configurations have since been added.

A month or two ago (read: sometime last year), I discovered the Antsle via an ad on the /r/homelab subreddit, which I read sometimes but don't post to. I thought it was an interesting idea, and the ad they aim specifically at the homelab subreddit is particularly interesting, because it basically positions the product as a competitor to VPS services from cloud providers.

In this article, I will not address privacy concerns: I think you either believe a cloud product is "fine" and it sufficiently addresses privacy and data safety concerns, or you do not, and you likely wouldn't use one anyway.

Antsle is a small virtualization appliance running a Gentoo-based distribution and some customized management tools. The hardware is mediocre: there are 4- and 8-core versions based on a now-old Intel Atom processor, and it can be equipped with up to four 2.5-inch disks and 64 gigabytes of memory. Antsle does you a solid by starting the configurations with SSDs, although there are HDD expansion options, and if the need arose, you could install an HDD later. (Although they say this voids any remaining warranty.)

The machines are extremely costly for what you get. The top-end configuration with an 8-core CPU, 16TB of SSD storage, and 64 gigabytes of RAM is over $12,000. The base Antsle unit with a quad-core CPU, 8GB of RAM and two 128GB SSDs (a nice touch) is $760. It bears mentioning again that the CPUs they've used are high-end Atoms: they should handle most things fine, but they are by no means speed demons. (EDIT: the new upper ceiling is $15,600, although that machine has a better processor, a Xeon D with more cores and a higher RAM ceiling, plus room for more storage devices.)

The site claims the Antsle is for developers and geeks who want to use VPS services and get advantages of cloud computing, but don't want to, you know, actually use cloud computing.

The other issue I have with this marketing is that it seems odd to market an appliance to the developer and geek market. The Antsle is a normal x86 computer onto which you could install another OS, but a huge part of their marketing is about their particular mix of virtualization and containerization. These customers are the ones most likely to want to run their own software stack, or to be willing to install the operating system of their choice on the appliance.

The Product page talks about easy access to your Antsle from anywhere and use cases such as hosting web sites, but it doesn't talk about how. They don't address the potential costs or inconvenience of using a home or small office Internet connection to run one of these devices.

The way this must happen is that either your Antsle phones home (to Antsle or another service) and your data flows through a remote datacenter when you access it from outside your home or office, or you must purchase an Internet connection whose terms of service allow running servers, with one or more static IPs, plus a domain name (usually needed with cloud services as well). That translates into a business-class Internet connection, which I have covered before as being kind of a scam. For an idea of the cost: I have such a connection with five public static IP addresses, and I pay $180/month for it. That's nearly a $140 upcharge over what the same speed would cost on my provider's residential plans. The other ISP in my area charges even more for lower speeds.

An Antsle is favorably priced only if you do not count the cost of an Internet connection. Amortized over a year, the baseline Antsle works out to around $64 a month. An average EC2 instance that's competitive with a low-end Antsle costs around $52/month, in perpetuity, and that includes a public static IP address. If you need to pay $100 over your current Internet costs to get a public static IP address, then you're looking at double what EC2 costs, every month, in perpetuity, before you buy any hardware.

At the lowest performance level, if you always have to pay for your Internet connection anyway, and the upgraded connection costs around $100 more per month for more speed, the removal of a data quota, and a static IP address (this is just a guess), then the cost of an EC2 instance will never cross over the cost of running an Antsle.
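To make that break-even argument concrete, here is a minimal back-of-the-envelope sketch using the figures above (the $760 baseline Antsle, a roughly $52/month EC2 instance, and an assumed $100/month Internet upcharge); the exact numbers will obviously vary by provider, and holding them constant is my simplification.

    # Rough break-even sketch; figures from the post, held constant for simplicity.
    ANTSLE_UPFRONT = 760      # baseline Antsle, one-time purchase
    INTERNET_UPCHARGE = 100   # assumed extra monthly cost for a static-IP business line
    EC2_MONTHLY = 52          # comparable EC2 instance, public IP included

    for month in (12, 24, 36):
        antsle_total = ANTSLE_UPFRONT + INTERNET_UPCHARGE * month
        ec2_total = EC2_MONTHLY * month
        print(f"month {month:2d}: Antsle ${antsle_total:,}  EC2 ${ec2_total:,}")

    # Because the Antsle's recurring cost alone ($100/month) exceeds EC2's
    # ($52/month), the EC2 line never overtakes the Antsle line; the gap only widens.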

That's only one situation, but it's worth noting that Antsle materials rely extremely heavily on advertising it as something on which you can run public-facing services. So, we have to presume that you're buying up to the highest possible Internet connection speed and purchasing one or more static IPs to use for service. The cost of DNS services and any software you might be licensing on the device will be the same for both EC2 and an Antsle.

It's worth mentioning here that regular tower servers are a lot less expensive than the Antsle for configurations that are much faster. Dell has a configuration of the PowerEdge T30 with a much faster quad-core Xeon E3 processor, 8GB of RAM and a 1TB hard disk for $550 (compared with the $760 base price of the lowest-end Antsle). The T30 is easier to maintain and upgrade, so putting in a second hard disk and configuring mirroring should be easy and inexpensive. If you step up one level in Dell's product line, you can configure the machine from the factory with solid state disks and multiple drives.

Part of the advantage of buying such a system from Dell or building your own (compared to the Antsle, specifically) is that you can balance the system to your needs. For example, the PowerEdge T130 lets you choose a dual-core Celeron CPU (which will still be faster than the Atom in the Antsle), 8GB of RAM and a mirrored pair of 1TB hard disks for around $740.

The trade-offs here are that the machine won't be completely silent and you are trading solid state storage for two slower hard disks, which should be fine for passive server workloads, even with several virtual machines. I personally tend to believe that silence is a little overrated anyway, but if you need silence, the best thing about servers is that you can generally locate them away from your primary work or living area. A small server based on a Xeon E3 processor will run fine in a closet or a big enough cabinet, even.

Buying a regular server has the same trade-offs against Amazon EC2: you are using your own electricity and network connectivity to run the server. The reason I insist on including connectivity in this is that Antsle's web site shows a lot of usage of it to host client web sites. I'm presuming the idea is that a web designer or programmer would use it to host a client site. Web sites, even interactive ones, don't take an awful lot to host these days in terms of hardware, so you can either run one web server that listens on multiple names or a VM for each site, depending.
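As a quick illustration of the "one web server that listens on multiple names" option, here is a minimal sketch of name-based virtual hosting: a tiny WSGI app that picks a site based on the Host header. The hostnames and pages are made up for the example; a real deployment would more likely use virtual hosts in a full web server such as nginx or Apache.

    # Minimal name-based virtual hosting sketch: several sites, one server, one IP.
    from wsgiref.simple_server import make_server

    SITES = {
        "clientone.example": b"<h1>Client One</h1>",
        "clienttwo.example": b"<h1>Client Two</h1>",
    }

    def app(environ, start_response):
        host = environ.get("HTTP_HOST", "").split(":")[0].lower()
        body = SITES.get(host)
        if body is None:
            start_response("404 Not Found", [("Content-Type", "text/html")])
            return [b"<h1>Unknown site</h1>"]
        start_response("200 OK", [("Content-Type", "text/html")])
        return [body]

    if __name__ == "__main__":
        make_server("", 8080, app).serve_forever()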

The other thing that needs to be considered is that residential Internet connections usually aren't good enough to host web sites. I have a connection with 20 megabits upload speed. It works fine for my personal needs, but it would be bad for paying customers. Service providers running cloud or even machine colocation services put a lot of effort into making sure that machines under their control stay running. An Antsle is still susceptible to power interruptions in your home or office, as well as network connectivity issues.

Another option that the hobbyist market (especially, say, /r/homelab, which is where I saw the device) usually pursues is used hardware. Even a machine such as a years-old business desktop that has been cleaned up a little bit and given some more RAM and storage will be much faster and much less expensive than the Antsle. The money could be used on purchasing more capacity or something like backup hardware or tools.

For developers with desktops, I can't help but imagine a better option is to buy a second storage device and some more RAM, and run virtualization software like VMware Workstation or VirtualBox. If you don't need an entirely new computer (or a graphics card), RAM is… less expensive than an entirely new server, and an SSD or hard disk is inexpensive. The Antsle is pictured next to an iMac on their web site. If you are using an iMac and it is anywhere near replacement, an Antsle good enough to be worth not running your development environment directly on the iMac costs enough that you could put the money toward an iMac Pro instead, removing that benefit.

One of the disadvantages of the Antsle is the funding model for the hardware, which exists for any on-premises product. Antsle is offered as a product that you buy, straight-up. It would be a more interesting proposition if you could lease an Antsle, or if it were available on a payment plan beyond whatever financing you can arrange personally, especially if it were coupled with a VPN service to defray the cost of having a public IP address (as some devices, such as the HPE EC200, are).

It looks like Antsle isn't offering this kind of funding, leasing, or payment plans directly, however. Dell does, but there's no indication that Dell's funding is any different from buying the machine on a credit card you already have, or using funds from something like a short term personal or business loan. Depending on the financing used, there may be different terms and the actual amount paid will differ based on taxes and interest.

Antsle is interesting, and I understand that it's not for me, so it may well look better to people with different needs, but I struggle to think of the use case it is for. Perhaps the only situation I can think of is a developer who needs a low-power server on which to do development, who literally only owns laptops, and who lives in an extremely small studio apartment or uses an extremely small office space.

February 12
iMac Pro

Last year, Apple held a small event with a mere handful of big names in the Mac blogosphere. It was at this event we got the first whiff of the iMac Pro, which was merely mentioned as a "great" new iMac model in the pipeline. The event was meant to address the aging Mac Pro 6,1, which Apple discovered was designed in such a way that the newest graphics processors can't be cooled by its innovative (but oddly specific) cooling system.

The Mac Pro 6,1 was widely panned basically since it was announced as being a bad successor to the Mac Pro 5,1, which was a large tower computer with room for two processors, four hard disk drives, two optical drives, and four PCI Express slots. Years after its introduction, the Mac Pro 5,1 was still lauded as the best system for creative professionals. (Note: the linked article appears to be written by a business specializing in selling customized Mac Pros and aftermarket options specifically for the Mac Pro 5,1.)

I won't belabor the point too much, but it's worth noting that for all intents and purposes the 6,1 is a fine computer. The scalability problem allegedly keeping Apple from updating it is the dual-GPU design, which was done under the presumption that multiple midrange GPUs would become the standard for creative professionals, mainly because the AMD GPUs available in 2012 weren't particularly good at running general-purpose workloads at the same time they were operating as graphics cards. As such, Apple built a system meant for one CPU but two GPUs. The result was a system approximately 5% faster than the old one, at a great increase in cost and at the expense of internal flexibility. So, to use a car analogy, it was like giving someone a sports car when what they asked for was a pickup truck or a four-door sedan.

Apple's solution is two-fold. The first step in the plan is the iMac Pro. The second is an upcoming "modular" computer, about which we know nothing, other than it will allegedly be a good successor to the 5,1.

The iMac Pro, as the first part of Apple's plan to replace the Mac Pro 6,1, builds on the idea of the 6,1 by integrating a Skylake Xeon W processor and a Radeon Pro Vega graphics chip (new parts at the start of their lifecycles!) into the body of a 27-inch iMac and adding a bunch of Thunderbolt 3 controllers. The thinking is that part of the problem with the Mac Pro 6,1 was finding good displays for it, due to its Mini DisplayPort/Thunderbolt 2 outputs. The new design should also be much more scalable as Apple works to keep the system updated in the future.

The announcement of the iMac Pro's availability was met with a lot of interesting commentary, almost exclusively about the price. The baseline configuration is $5,000 for a system with an 8-core CPU, 32 gigs of RAM, 1TB of SSD storage, an 8-gig Radeon Vega 56 graphics chip, and what's likely the best 27-inch display you can buy. The top configuration is around $13,999.

Immediately, most of the commentary was (as it always is) about how you can build an equivalent PC workstation for a lot less money. This is technically untrue: the argument almost always relies on a user not needing most of what's available in a configuration, and on the fact that Apple often intentionally chooses only high-end parts out of a range. For example, there are desktop workstations from PC OEMs with Xeon W processors, but those systems can be configured with quad-core CPUs and 16 gigabytes of RAM, which Apple does not allow. The other thing to consider as part of the iMac Pro's cost is its 27-inch, P3-capable 5K display, which isn't available inexpensively anywhere else. The nearest configurations from Dell can match the iMac Pro's CPU, memory, and storage configurations, and then use a low-end GPU and a low-end display. This is, of course, the beauty of the generic PC market – not everybody needs Apple's 5K display or a Radeon Vega GPU, and building a different system allows you to put that money into different things.

The interesting comparison I heard from a lot of people, and it was surprising to see this from some Mac power users, was that a "workstation" built using high-end desktop enthusiast parts would be well received. This is where we start to get into some interesting discussions about what makes something a "workstation."

Traditionally, something was a workstation if the vendor called it such. Computer vendors have traditionally been careful with the w-word, because using it meant claiming you believe your product is a step or two above the competition. For most of the 1990s, this meant that while the Mac and PC markets contained professional computers, they didn't contain workstations: for every Power Macintosh 9600, there was a much better equipped SGI Octane or Sun Ultra or Compaq AlphaStation with 64-bit processors, a better operating system, faster networking, properly implemented and much faster SCSI subsystems, and so on.

In the 2000s, workstation-class hardware started becoming less expensive and the money needed to rev up all the different platforms wasn't quite so available as it had previously been. Intel had been building chips suitable for low end technical workstations and Windows NT was up to the task of being a workstation OS. AMD also produced the 64-bit extensions to x86 and licensed them to Intel. Apple had just acquired a workstation vendor, NeXT, and started coupling the better OS with some of its newest hardware.

In the early 2000s, Apple started advertising its hardware and software to classical RISC UNIX workstation users who were looking for a modern platform, especially as some of the RISC UNIX vendors failed to commit to building new workstations based around their old UNIX operating systems, either on the old processors or on other hardware.

I've had a lot of discussions with people on the finer details of this point. It is argued that because Apple was putting its foot almost exactly up to the "workstation" line, with ads such as "Sends other UNIX boxes to /dev/null" and efforts such as the Xserve G4, it's safe to say all Macs running OS X at that point were workstations because of Mac OS X. I tend not to agree with this point, because Apple had built UNIX systems before, none of which it tried to classify as UNIX workstations in the sense that, for example, a Macintosh IIfx could compete against a SPARCstation. By 1990, when the IIfx came out, Sun had moved on to the SPARC architecture, and in raw compute numbers a SPARCstation was a couple of times faster than the IIfx.

The change in the mid-to-late 2000s is that "workstation" went from meaning a machine in a different performance class from normal desktop computers to meaning a machine that is qualified to run specific applications, or is qualified by particular hardware features (for example, error-correcting memory), regardless of performance class.

I have been talking about The Plateau for a few years now. I should probably start a page or category for it. The relevance here is that leading into the 2010s, new workstation products started to use Intel Xeon processors aligned very closely with mainstream desktop platforms. These represented a new low end for workstations, positioned for "entry level" work and often certified for tasks such as viewing CAD files, 2D CAD, software development, general-purpose UNIX chores, and so on.

This is speculation, but my theory is that Apple, upon starting to use the term "workstation," decided its workstations should be a step or two above normal Macs in the traditional sense. There was a period of crossover in 2009-2010 when quad-core iMacs were starting to exist, at a time when the baseline Mac Pro was a quad-core configuration, for just $200 over the top-end iMac. Then the two lines diverged again, with the Mac Pro clearly demanding a premium for its performance and reliability improvements over the iMac.

Over the course of a few generations of iMacs getting new processors and graphics and with the Mac Pro 5,1 then 6,1 standing still, the gap narrowed again, but Apple has reopened it with the iMac Pro. Keeping that gap open requires that Apple keeps the machine updated, but that should be easier to do with the new thermal configuration.

The internal upgradeability of the iMac Pro has been discussed a lot as well. The memory can be replaced, but only at a service center. Tear-downs reveal that the solid-state storage is on modules, although the modules are unique, and that the processor can be replaced.

Storage flexibility was a prominent criticism of the Mac Pro 6,1, particularly as it pertained to the machine's video editing credentials (important, given that Apple pretty much designed it explicitly for 4K video editing). I don't know the state of ultra-high-end video editing today, but up to that point, it was common for the highest end video editing systems to feature not multiple internal hard disks but Fibre Channel or SAS cards to connect to disk arrays. Those fit in well with the large video tape recorders, other import/export equipment, sound boards, interconnect boards, and program monitors that often end up in the highest end systems.

I think criticisms regarding upgradeability are a little misplaced. Primarily, Mac users remember (or hear about) a time in the distant past when the machines could be upgraded with new parts and processor generations well beyond what is considered reasonable or necessary today. These upgrades often didn't deliver anywhere near the potential performance of a whole new computer, but I understand the appeal as a way to get a little more life out of a machine in an era when the bare minimum baseline price for a fast new machine is over $3000.

It would be nice if upgrades for capability and capacity, like RAM or storage, were easier to do, but they appear to be doable, and externally housed storage is in a better position than it has ever been, so it's not particularly worrying to see, say, a machine where the primary way to add storage is via Thunderbolt 3.

January 15
Adobe’s Profitability and Licensing

Meta: I wrote this a few months ago, but I'm finally posting it now. I've since had an opportunity to poke at some of the things I wrote here, some of which became true, and will have more thoughts on that later.

Outside of news at the high end of the enthusiast desktop microprocessor market, tech hasn't done anything that I specifically want to write a lot about for a while. I'm getting back into the swing of things after not posting for a while, due to NaNoWriMo, and I figured an easy thing to talk about would be Adobe. This is partly adapted from a tweet (and replies) I posted a few months ago.

For context, Adobe released some earnings information a few months ago.

Perhaps the most relevant bit is this:

Adobe achieved record quarterly revenue of $1.77 billion in its second quarter of fiscal year 2017.

Adobe says a few more things here, but what it boils down to is essentially… they killed off most of their perpetually licensed software products and replaced them wholesale with services that include desktop software.

Even though this transition ostensibly happened a few years ago, profits are up-up-up. It makes sense: you could still buy Creative Suite 6 by calling in up until around a year ago, but that has since stopped.

Adobe Creative Cloud is kind of a near-and-dear subject to me in a weird way. Long ago, I was a photography student, so I came to the university with my Mac, and very shortly after it was available, I purchased a copy of Adobe Creative Suite 3 Design Standard, and a copy of Dreamweaver CS3 on the side. I did this because I wanted to be able to use Photoshop and Bridge, with Illustrator and InDesign, and I wanted to build a web site, but I didn't particularly care for Flash.

I used that copy until I stopped having a Mac and then I handed it off to another person who needed it and had gotten a Mac. They used it until it stopped working well with the current versions of Mac OS X.

Part of why this was possible was educational pricing. The other part is that, due to the perpetually licensed nature of software at the time, I could keep using these tools for several years without paying for them again. I had what I needed, I didn't have things I didn't need or couldn't use, and because I wasn't swapping files with other users of these programs, issues surrounding format compatibility weren't important.

When Adobe announced Creative Cloud, it was initially an alternative licensing scheme. It looked like it would be a great deal for people who frequently needed to buy newer copies of the software to keep up with feature needs or with collaboration, and for people who needed all or most of the different functions.

Adobe didn't (and honestly, still doesn't) do a whole lot to make Creative Cloud really compelling as a cloud service. The pricing is compelling for some, and it's an interesting and likely effective way to encourage people to stay up to date. But with Creative Suite 6 no longer available, and with Adobe making it more and more difficult to purchase licenses for Acrobat and Lightroom separately from the rest of Creative Cloud (or outside of Document Cloud and the Creative Cloud Photography plan), it becomes less and less compelling for people who don't need all of these products to stick with Adobe at all. Lightroom 6 appears to still exist, but you need to dig deep to find it. The same is true of standalone versions of Acrobat Professional.

I would be more amenable to the idea of, say, using an educational discount if Adobe's terms for that product didn't dictate that students can now only use the "special" educational rate for one year. After that, the rate goes up to $360/year, which is still less than the $600 yearly retail cost, but harder to swallow than the old cost, which might have been $300 once in an educational career, and certainly less appealing than the $240/year special they advertise heavily to students.
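For a sense of scale, here's a quick arithmetic sketch of those educational rates over a hypothetical four-year degree; the $240 first-year rate, the $360/year renewal, and the roughly $300 one-time student price for Creative Suite are the figures above, and assuming they hold steady is my simplification.

    # Educational pricing over a hypothetical four-year degree, figures from the post.
    YEARS = 4

    creative_cloud = 240 + 360 * (YEARS - 1)  # first year at the advertised special, then renewal
    creative_suite = 300                      # one perpetual student copy, bought once

    print(f"Creative Cloud (education), {YEARS} years: ${creative_cloud}")
    print(f"Creative Suite (education), one purchase:  ${creative_suite}")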

If I may, I'd like to take a detour into Office licensing. Microsoft releases new versions of its Office software suite every "few" years. Historically, they change the file format a bit less than once a decade, and they allow any "supported" (basically, today minus ten years) version of Office to connect to hosted services such as OneDrive, SharePoint Online, and OneDrive for Business.

Buying a full, perpetual copy of Office is still relatively easy to do and has always been an "expensive but not that expensive" proposition. $150 for the home version and $400 for the professional version is the current pricing. Buying Office 365 access ranges from $80 for four years to $150 yearly, depending on the customer and the desired functionality.

For the subscription service, Microsoft does the right thing by integrating Office with services people are likely already to have (Skype, Hotmail/Live/Outlook) and adding benefits such as a terabyte of storage space. On the "regular" home version of Office 365, the software can be installed on up to five Macs or PCs (compare with "One" for Adobe, and with no good way to license it twice for convenience) and up to five sub-accounts can be created with their own e-mail, Skype, and OneDrive service. In a family situation, these services can be used to reduce the cost of licensing Office for everybody's computer.
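As a rough sketch of why that sharing matters, consider a hypothetical household with five computers, using the prices quoted above; the five-machine household, the four-year horizon, and treating the $150/year figure as the relevant Office 365 tier are all my assumptions for illustration.

    # Office licensing for a hypothetical five-computer household over four years.
    COMPUTERS = 5
    YEARS = 4

    perpetual_home = 150 * COMPUTERS   # one $150 perpetual Home copy per machine
    office365_home = 150 * YEARS       # one subscription (assumed ~$150/year) covers up to five installs

    print(f"Perpetual Office Home, {COMPUTERS} machines: ${perpetual_home}")
    print(f"Office 365, {YEARS} years of subscription:  ${office365_home}")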

Adobe, on the other hand, provides 20 gigabytes of storage space which appears to include the space needed to host your portfolio web site. I can't imagine in what context 20 gigabytes of space is a particularly useful online storage bucket for tools like Photoshop and Premiere Pro. It's not unimaginable for a single large Photoshop project to exceed 20 gigs. The photo upload folder from my cell phone is just shy of 15 gigs. A single particularly active day shooting with my eight-year-old digital SLR camera can yield 15+ gigs of data. My newer camera has a 128-gig memory card in it, and can shoot video.

It seems preposterous that 20 gigabytes of space would have any use at all for work in most of Adobe's applications. And then Microsoft goes and gives you a whole terabyte of space to use for your resume and your taxes – documents that use mere kilobytes of space.
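To put those numbers side by side, here is the arithmetic on the figures mentioned above (all of them are from the post; nothing else is assumed):

    # Scale of the 20 GB Creative Cloud allowance against the figures above.
    allowance_gb = 20
    busy_shoot_gb = 15     # one active day with the older DSLR
    phone_folder_gb = 15   # the phone's photo upload folder
    card_gb = 128          # the newer camera's memory card

    print(f"Busy shooting days that fit in the allowance: {allowance_gb / busy_shoot_gb:.1f}")
    print(f"Share of one full memory card it holds: {allowance_gb / card_gb:.0%}")
    print(f"Room left after the phone folder: {allowance_gb - phone_folder_gb} GB")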

There are rumors (although, nothing solid from Adobe) that a next-generation Lightroom component or perhaps a stand-alone Lightroom service will offer more storage space, to do something Apple and Google (and Microsoft, to a certain extent) have been trying to do for a few years now: put The Cloud at the center of a photo workflow.

I think this could be the pivot that would make paying monthly for an Adobe cloud service centered around having enough room to store a full photo and video library make sense. In total, I probably have a bit under 400 gigabytes of photos and videos I have shot over the years. A perpetual problem of mine has been managing the library with multiple computers, and quickly accessing images when I am not at my "main" or "photo" computer. (Mostly because good software to manage photo libraries costs a lot to license for several computers.)

The rumor that has been floated was that Adobe is working on a browser-based version of Lightroom that uses the library stored primarily in a new terabyte of online space, and presumably syncs to mobile and desktop versions on client computers. With the correct synchronization and setup (As I mentioned on Twitter, I have wanted to use my iPad to view and organize photos since I've had one, which was literally the day they were available at retail) it should be possible, perhaps even easy to pull a bunch of new images into an iPad and have them magically go to your main photo computer and your online account, where they are backed up.

iPads are fast enough to view and work with photos, and their beautiful, accurate, high-resolution displays should make them particularly good at it. Apple has also been selling USB connectivity hardware to make image transfer from cameras and storage hardware for long enough that it feels like a shame this doesn't already exist.

This one change on its own doesn't justify the fact that Creative Cloud is $600 yearly. It seems problematic that there's no good way for somebody who wants to run a simple pure HTML web site to do so with Dreamweaver, or that Flash should be perpetually locked behind one of these plans.

I don't know if there's a good way to solve this problem, short of doing things like watermarking the output of the programs (which can dampen even hobbyist or educational use) or returning to a system where subscription tiers are based on which applications are available, with the highest-end applications, such as those used for audio and video processing, being what gets you into the most expensive bundles.

I understand why Adobe did it: it simplifies the product stack, and it allows users to grow into parts of the suite they may previously have considered unavailable. (For example: using Premiere Pro to build videos to put on your web site, adding a video track to a podcast you edited in Audition, or building a title sequence for your video in After Effects.)

As somebody who uses other tools for some of these tasks, and doesn't have time to grow out of some of the more basic tools, the best option really does appear to be, as I said in the last tweet of the thread, to just go elsewhere.

Not every Adobe product has a viable competitor, but realistically, most of them do. If my focus is print design with InDesign, I can go look at Microsoft Publisher or QuarkXPress (whose financials might be interesting to look at these days).

In the early days as Adobe started to push Creative Cloud and announced that CS6 would not be updated and would not be replaced with a CS6.5 or CS7, many opined that it would be their undoing. It appears clear now that Adobe has no trouble maintaining profitability with every product they sell being a subscription. I would further argue it has always been clear subscription-based licensing is best for software vendors, and that they have a lot of incentive to move in that direction. The primary limitation has always been connectivity, which by the late 2010s had been resolved.

It's possible that in the overall computing market, the good of this will outweigh the bad, as people who can't bear Adobe's licensing model move to competitors that once looked like they would languish to their death (I'm looking at Quark with this one), and as new competitors (Affinity, Pixelmator, Pinegrow) appear to attempt to undercut high end graphics software.

As such, I don't think Adobe is looking to "fix" the "problem" that their software is largely inaccessible. I don't even believe they believe it's a problem. There's also the question of what it means to be accessible, in this sense. Just because you could go to CompUSA and buy a copy of Flash MX or InDesign CS off a shelf, were those products any more "accessible," given their prices and the complexities of licensing them, especially with issues such as upgrade pricing and cross-platform changes?

This kind of issue is part of what motivates open source software developers, which is good. However, most open source software makes a poor or barely viable replacement for these kinds of tools. Often, while you're learning generic concepts in an education program, you're also learning the mechanics of specific tools that are common in an industry. Using a variant of Blender meant for video editing may be worthwhile for a home movie, and it's probably even a good tool in general, but it's unlikely to be, or even be much like, what is used in professional contexts, and it could teach "bad" workflow habits or techniques.

Likewise, the interface on GIMP is intentionally very different from that of Photoshop. Compare with LibreOffice, which aggressively styles itself after Microsoft Office 2003, for a variety of reasons. Scribus, similarly, avoids making itself look like InDesign or XPress, the industry-standard print layout tools.

Whether this is because these developers think they can design better software than the professionals who both do this work and have been working on this software for years, or whether it's done out of malice, I couldn't tell you. That also sidesteps the fact that most of this software simply isn't set up to deal with certain issues. Scribus is, at best, a competitor to Microsoft Publisher, and GIMP is, at best, a competitor to Paint Shop Pro.

At the end of the day, though, a budget is a budget, and it's up to computer users to decide what theirs is and find solutions that work within it. Adobe has never been about building budget-conscious software, and Creative Cloud is nothing if not an affirmation of Adobe's belief that they are at the top of their markets.

January 08
Quick Computer Security Thoughts

Meta: It's been a while since I've posted! I have some other things in the works, but this was a quick gimme based on some recent events, and it has been good to sit down and write it.

Last week, the early announcement of the Meltdown and Spectre attacks surprised many, although not by an awful lot. Word had started to get around on Twitter the night before, and I managed to get in an early word.

As a quick recap: Meltdown and Spectre are two newly discovered and disclosed security vulnerabilities. Meltdown applies primarily to recent Intel chips and allows unprivileged processes to read memory they should not be able to access, including kernel memory. Spectre is a bit harder to pin down, but the important thing to note is that it abuses speculative execution and branch prediction on modern CPUs to leak the contents of memory locations that should be off-limits.

The early buzz was entirely about Meltdown. Before all the information was out, people were reporting that patching it might cause a 30% slowdown. Of course, in real testing, most user-facing workloads see a much less severe penalty, although many server tasks will have trouble.

The looming specter of the whole situation, though, is Spectre. Spectre is more difficult to exploit, but could have much more severe impacts, and will be much more difficult to protect against in software.

Permanently fixing vulnerabilities related to Spectre will require entirely new CPU silicon. A few proof-of-concept Spectre attacks are already being patched against, but there are so many possibilities that it's likely server hardware (and anything running a desktop-class OS as well) can't be considered "safe" until it's simply replaced by a new generation of hardware.

The next generation of computer hardware, currently in the design phase, likely doesn't fix this. It's possible that the generation after that also won't fix these problems in hardware. We're looking at systems two or more generations away to fix this, and that will take between two and five years, depending on what's needed to mitigate these risks.

Once a CPU is designed and verified, there will be the matter of producing enough of it to meet demand. Cloud service providers and enterprise datacenters will be doing their best to get at these chips first. In a situation where hypothetically every server system doing work in cloud or service provider setting needs to be replaced, it could be years before silicon is available for consumer and desktop applications.

I think by the time this is published, the moment of true widespread panic will really be over. A huge rush on server-class systems may or may not appear in a few generations, and even if it does, it probably won't look too abnormal for a processor generation launch. The chipmakers (Intel and AMD) may do a server-first release, but they may not bother, opting as they generally do to build consumer silicon first.

My prediction is that the hype will pass and that systems departments will, as ever, increase monitoring and attempt to decrease exposure. In theory, this is what a good information systems department is already doing, so it's going to be a matter of doing the same thing, but more, instead of doing a different thing.

On the desktop side of things, I think that following guides for security such as Decent Security is as important as ever.

I can't stress this point enough. It has been fun to watch the vintage computing circles on Twitter fall over themselves to come up with the most creative ways to avoid these hardware vulnerabilities, and it has been in good fun, but a necessary side-effect of digging out a twenty-year-old machine to avoid a modern hardware vulnerability is that software vulnerabilities are re-introduced. This is especially true on anything running closed-source software, or for which modern releases of open source software are no longer available.

I love pulling out my old computers, but everyone should keep their modern patched Internet-faring computers ready to go. Part of regular security operations in the computer industry will involve vendors releasing more patches for Spectre-class vulnerabilities as they're discovered. Users of modern operating systems that get patches from a vendor will benefit from those patches as they become available.

Meanwhile, most of my desktop systems have already been patched against Meltdown. My Mac downloaded and installed the patch before the new year, which might say something else about the state of information sharing among security professionals, but that's for later, if ever. My Windows systems also have the patch, and as I predicted on Twitter, I haven't noticed any difference. I have yet to go play a game and record my screen and a camera at once, but I'm not particularly worried about that working well; it was fine the last time I tried it.

Personally, I'm not making any big plans to rush out and replace any hardware I know or suspect to be affected by Spectre, mostly because there's nothing better right now. My server and my desktop are each pushing six or seven years at this point and depending on what my budget looks like in a few years, I think I will be able to make a relatively easy case for replacing either of them. My laptop is still less than a year old. Its replacement isn't even on the thought roadmap yet.

I think Spectre has larger possible implications for the model of centralized services and the cloud, but that will have to wait for another time. New vulnerabilities are always exciting, but the takeaway should still be to run a modern OS, patch it regularly, and keep an eye out for possible trouble and monitor the machine's behavior.

July 31
How Software Changes

One of the ideas I think about from time to time is the notion that computers get faster, and that when computers get faster, the software that runs on them automatically benefits.

This is true to a certain extent. You can usually do things like overclock a CPU or put in faster storage or more or better memory and get benefits such as applications that run faster. However, it's not always the case.

It's important to acknowledge that there are different ways computers speed up, and different ways software applications use the speed that's given to them.

This stuff tends to be more directly important when you consider non-trivial computing tasks. It doesn't really matter how much faster a system gets for things like Word and Outlook, because you can only launch Word so instantly, and word processing has largely been a task where computers have been waiting for the user to do their part for the past twenty to thirty years.

Video and audio, on the other hand, as well as other higher end applications, present interesting scaling challenges. The way we do these things has shifted a lot, even in just the past ten years, let alone the past twenty or thirty years.

Video is the example I happen to have most experience with, because it was around ten years ago that I was trying to get into some video production stuff myself. I had been doing some light editing and had been experimenting with digitizing VHS/S-VHS tapes at the end of high school, and I had been given use of a DV camcorder for much of my senior year of high school. I assisted with the on-campus TV station when I got to the university, and did so for a few years.

At the time, dual-core CPUs were new, but dual processing as a concept really wasn't; it had just gotten more compact. Quad processing was appearing in workstations, but that's not such a huge jump.

In the late '90s and early 2000s, working with video on a computer tended to be very tape-based – either in the sense that you were importing or reading video from a tape directly, or in the sense that your digital file format was itself somewhat tape-like, or based on tape operations. In the early 2000s in particular, some of the early tapeless digital formats for professional video capture came along, such as Sony XDCAM and Panasonic P2, and although you could copy the files, the mechanisms to capture this video from the camera or from a deck to the computer were still entirely there.

This goes deep. Deep enough that at the time, people and tutorial books and even software vendors cared deeply about ensuring that people using the computer to edit video were, in essence, as polite to the machine as possible. What this generally meant was using file formats that were friendly to video editing. Ideally something like Motion JPEG, which was just what it sounds like: video compressed using a collection of frames that are each JPEG compressed.

Most compressed video is compressed in some way or another using two kinds of frames: key frames, which contain the entire picture, and intermediary frames, which describe differences between the last key frame and the current frame. The idea here is that you can make video a lot smaller, especially video where lots of things stay the same, if you either cleverly place key frames or space them relatively evenly throughout a mostly static scene. Motion JPEG and likely other similar formats work by making every single frame a key frame. In the '90s, when you could get a Silicon Graphics system to capture video, Motion JPEG was a common way to do it, because heavier compression was too difficult to do in any kind of real time.
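Here is a toy sketch of that idea (not any real codec): a key frame stores the full picture, and each intermediary frame stores only the pixels that differ from the most recent key frame, which is how the scheme is described above. The frame data and interval are invented for the example.

    def encode(frames, keyframe_interval=5):
        """Encode frames (lists of pixel values) as key frames plus deltas."""
        encoded, key = [], None
        for i, frame in enumerate(frames):
            if i % keyframe_interval == 0:
                key = list(frame)
                encoded.append(("key", key))          # full picture
            else:
                delta = [(j, p) for j, (p, k) in enumerate(zip(frame, key)) if p != k]
                encoded.append(("delta", delta))      # only pixels changed vs. the key frame
        return encoded

    def decode(encoded):
        frames, key = [], None
        for kind, data in encoded:
            if kind == "key":
                key = list(data)
                frames.append(list(key))
            else:
                frame = list(key)
                for j, p in data:
                    frame[j] = p
                frames.append(frame)
        return frames

    # A mostly static clip: the deltas stay tiny, so the encoded form is small.
    clip = [[0] * 100 for _ in range(30)]
    clip[10][3] = 255                      # one pixel changes in one frame
    assert decode(encode(clip)) == clip

    # Motion JPEG-style encoding is the degenerate case: keyframe_interval=1,
    # every frame is a key frame, so files are larger but every frame stands alone.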

This kind of key-framed video works well for distribution, but the thing most video editing guides were sure to mention at the time was that it was more difficult to edit, for a variety of reasons. The guides weren't wrong, per se, but in retrospect, reasonably robust video editing software handled this problem with relative ease, and software that didn't handle it got around it, usually by simply refusing to import video of the wrong format. The point here is that this suggestion has almost entirely gone out the window. It's just accepted that (in general) modern software can deal with this. Similar to the key-frame issue are other issues surrounding which particular compression codec gets used for the video you're editing. In the mid 2000s, h.264 video existed, but it was a big no-no to edit on that type of video, again, out of politeness to the machine that would need to do it.

And again, this changed. We now cut all day long on formats like AVCHD and MP4 files that use h.264 and even h.265 compression, because cameras and phones of all kinds easily compress good looking video into these formats, and because computer horsepower is so cheap that it doesn't matter if your video editing software needs to compensate for where the key frames are. (It never really did, especially in Final Cut Pro, which always edited on references to video files, rather than by placing absolute cuts in source files.)

At some point along the way, we basically got tired of the limitation that video must come into a computer in exactly real time, and the whole process got much better for it. Part of what enabled this is that cameras changed. It was probably going to happen as flash media (such as Compact Flash and SD cards) got better. Along the way, we had interesting ideas such as disc-based cameras, and there were hard disk fanny packs for DV and HDV cameras, but flash media won out in the end.

The other part of what enabled this is that computers got faster. As you move from one, to two, to four, to now eighteen cores in a "high end but not unreasonable" desktop computer, your ability to get the computer to do more things for you or just deal with adverse conditions improves. Of course, at any speed, video editing in particular is helped by having more memory available and faster, bigger storage. Graphics processors are also a big help. In the early 2000s, a system's GPU generally only impacted how quickly it could render things given to it by the CPU, but that started to change, especially as the 2010s rolled through, with the graphics processor itself taking a more active role in what was displayed. Today, GPUs do all manner of video editing tasks in lieu of or as an assistant or coprocessor to the CPU. It's to the point where in reality, having a good graphics processor is likely more important to most video editors than having a high core count desktop CPU.

So, computer hardware in that respect has changed a lot in the past ten years. If I go pick up a copy of Final Cut Pro 6 from around 2006 or 2007, it will in fact run on a brand new Mac, but it will get exactly none of the benefit from all of the advances made over the past ten years. Final Cut Pro 6 stumbles all over itself, badly, if you aren't so kind as to give it the correct files, and, as a 32-bit application, it makes very poor use of large amounts of memory.

You can edit a video with Final Cut Pro 6 on a modern computer using modern files. It lets you do things, but it's a deeply unsatisfying experience as you watch only one or two of four threads get used (even on an older Mac mini), see only two or three of eight gigabytes of memory get used, and need to do things like render previews of the video after every simple edit.

Some of these things are just par for the course in terms of how video used to work. Even when editing with the DV file format, which is one that Final Cut likes, you had to take frequent render breaks or spend a lot of time just guessing at what a final product would be like. Final Cut did nothing in the background, because the hardware of the day just couldn't support it. (It's worth noting here that in overall system performance, a pretty mid-range business desktop from around 2011-2012 is probably around four times as fast as a high end workstation from five or six years earlier in 2005.)

The reason I keep picking on Final Cut Pro here is that in 2011, Apple introduced a new version of the software, "Final Cut Pro X" to replace Final Cut Pro 7. I think it would probably be fair to describe Final Cut Pro X as a complete re-imagining of what editing video should be like in the modern era of computing.

Remember: in 2011, an iMac had a quad-core CPU and a powerful discrete graphics processor, could run 16 gigabytes of RAM, and could accommodate SSDs far faster than anything a PowerPC G5-based computer, or even the first generation of Intel-based Macintoshes from 2005 and 2006, could use. Such fast storage barely even existed in 2005, let alone on the Mac. Some high-end Macs could take that much RAM, but it was rarely installed, because 64-bit software didn't really start existing on the Mac until after the move to Intel CPUs.

Apple's movement on this issue was faster and more sudden than almost anybody else's in the industry, mainly because that's just how Apple does things. A lot of the movement in the program was based around moving away from a strictly filmstrip-based perspective and letting the software do more guessing on your behalf. A lot of it was based around the idea that as a content creator, you shouldn't really have to care about the technical details of the content. Final Cut Pro X does a lot of things in the background, and the way it works as a program also encourages working more quickly. In general, for example, editors spend less time waiting for renders to happen, because Final Cut renders video at low speed in the background while you're working. In addition, the timeline just supports playback of more types of files, so a render doesn't need to occur to play video recorded by a webcam, iPhone, or anything of that nature.

On the downside, there was a point at which, if you were a Final Cut Pro editor, you had to buy a new piece of software and then spend time re-learning and re-perfecting techniques you used to create a certain look. On the upside, your workflow could become a lot faster, and you weren't anywhere near as constrained to particular file formats or to doing pre-processing before you could start editing. (Another common meme from the old days: rendering proxy files to edit on, mainly to make up for particularly bad storage in laptops and low end computers, only to re-attach the originals and re-render the output at the end of a long project.)

Some applications, such as Adobe Premiere Pro, have kept up with these trends. Others, such as Avid Media Composer, appear to have doubled down on what I consider to be some particularly bad habits.

This leads me to pivot into audio a little bit, because the real context for some of these discussions has been that somebody on Ye Olde Computer Forum has asked for some wisdom (in different words) on buying some hardware to form a Pro Tools HD setup, circa 2003-2005.

The details here, which we actually discovered a few pages into the thread, are that this person is using a version of Pro Tools Native on a laptop that is a few years old. Inexplicably, even though the program isn't really using a lot of horsepower or RAM, the program stops working abruptly and gives an otherwise unidentified CPU error. The reader is left to presume that this probably means that the program hit some part in this person's audio file that is so complicated, it takes 100% CPU power to process, tops out, and then can't continue because a frame was dropped. A vexatious problem in any real-time media application, and a huge reason why in the days of yore with video, you might capture low-resolution proxy files, render or convert everything to an easy to use format, edit on that, and then let the computer chug for a day or so re-capturing all your video and meticulously re-assembling your project in high resolution.

This person wants two things:

  1. To use the Pro Tools HD processing cards (in this case, PCI-X cards) to build out a system (which would be a desktop from around 2003, compared to the existing laptop from around 2012), hopefully avoiding the mysterious CPU error
  2. To get an effect they want to use in Pro Tools HD 8 (the software version they would get), which didn't become available in the "Native" (software-only) version of Pro Tools until recently

I think ultimately the person wanted us to say that yes, in fact, it's reasonable to spend $600 (the price of the cards they wanted to get) on hardware that would enable them to build out an audio system circa 2003. To do this, they would need one of a very slim selection of PCI-X-equipped computers from the time, enough stuff to make that computer go and to maintain it (to be fair: they probably have that, it is a vintage computer forum), and then they would need to relearn an older version of this software, only to perhaps find that because their laptop is massively more powerful than, say, a Pentium 4 workstation or an early revision of the Power Macintosh G5, their production needs may still be unmet.

Here's where Avid sort of looks bad, I think. There is almost certainly no good reason for the "native" version of this software to be locked down the way it is. This person has a laptop with a quad-core CPU, a good GPU, a lot of RAM, and potentially a lot of very fast storage. They have a desktop with 8-12 CPU cores and capacity for at least 32 gigabytes of RAM, plus PCI Express expansion slots for faster storage. Other audio applications would almost certainly allow for about as many capture channels as that hardware can handle. Avid is using single-purpose DSP cards to dictate licensing and feature levels on its software products.

To me, doing it this way pretty much ignores that there are now better ways to do this work. A modern CPU can almost certainly outstrip these DSP cards, whose only real function appears to be to handle compression and to enable certain effects, even though neither of those things should require specialized hardware any more.

It's reasonable that a single or dual-CPU workstation from 2003 doesn't have the horsepower to do this. But, something from a decade later? The thing people in the discussion said was that "audio hasn't changed" – with the implication being that this was a "solved problem" and that as with word processing, there aren't improvements that can be made in process or efficiency.

Of course, I don't consider audio to be a "solved problem," especially when here in 2017, with 8-core CPUs at the mainstream desktop level and 12+ core CPUs at the enthusiast level, before we even get into actual workstation and server CPUs, you still need thousands of dollars' worth of DSPs from the '90s to do compression on audio to capture it to disk and then play it back.

For better or worse, I think the solution here needs to be that communities using these tools need to look at Avid and ask why this is the case. An iPhone or an iPad can easily record multi-track audio. An interface for doing so is of course necessary, but outboard processing hardware really shouldn't be.

If I were an Avid customer today, I think I would either be inciting a mutiny or I would simply stop being an Avid customer.

The conclusion to this sub-point is of course that this person has either already bought the Pro Tools 8 kit, or they're going to anyway, because a bunch of pointed leading questions about what guides their needs and what might make the best use of hardware they already have is not worth the time and effort.

I get that people doing creative things with their computer just want to sit down and do it, but this stuff is usually worth discussing, because if a change in tools can lead to better or faster results, then the justification not to do it seems thin. In the case of Pro Tools, I think something needs to be asked about what really causes the CPU error. I know that with my video editing work (when it appears, which, I will admit, is infrequently), newer software will immediately lead to a speed-up in my work, just because it will be able to better take advantage of the modern computers I have, and it will work more easily with the modern file formats I use.

I don't have particularly concrete examples, but advances in computer hardware generally need to be matched by advances in computer software to achieve the most meaningful productivity increases for non-trivial tasks. Computers may not feel faster, although a side-effect of much of what has improved over the past five to seven years is that they should in fact feel faster, especially as operating systems get fine-tuned and as application software gets updated to take advantage of new hardware configurations.

This isn't exactly a continuous climb, though. If an application is, say, 64-bit aware, it doesn't really need to become more 64-bit every time new computers that support more RAM come out. If an application is multi-threaded, it doesn't necessarily need to become more multi-threaded each time a new generation of CPUs comes out. What does call for change, of course, is when RAM ceilings jump enough that your application struggles to get a performance benefit it should be getting, when an application isn't designed to use more than a certain number of threads, or when CPUs become so heavily threaded that there's room for the application to do more work at a time.

There will be a point at which something will cross over from being difficult to being easy, perhaps even trivial. I would say that video is there, but it's really not – advances in video capture technology, and the fact that people will always want or need things like effects, multi-camera operations, different output formats, and so on, will likely mean that performance enhancements in computers will be meaningful to video editors for the foreseeable future.

However, something like photo management, and even things like print design and web design, which in the 1990s were reserved for the highest end of computers, are now things any random laptop, even a $300 one, can easily do. Illustration and low-end CAD tasks don't need particularly powerful computers any more. Other things like programming, virtualization, and even still image editing really depend greatly on the technique and a few other factors.

As always, I think it's an exciting time to be interested in computers. I would be lying if I said I thought there was ever truly an unexciting time to have something to do with tech. One of the things I'd like to do over the next few weeks is get my hands on some free/trial software – Final Cut Pro and Avid Media Composer First at least, perhaps Adobe Creative Cloud (for Premiere Pro) – and do some video editing testing. I want to see what I can push a few different systems I have to do and what the experience ends up being like.

I of course have my copy of Final Cut Pro 6 and have worked around some of its quirks. I have Avid Media Composer First installed on the system, and so that is probably what I'll use and test first.

One other thought that I haven't mentioned about Final Cut Pro 6: it and the other members of the Final Cut Studio 2 bundle that I have suffer severely from changes in Mac OS X. I can't in good conscience recommend anybody try to use it as a day-to-day editing tool on a modern computer. This is perhaps one of the most salient points I can make. It's already badly non-performant on something like a Mac mini from 2011. It will run and technically work on something like a much newer iMac, but each time I go to use it to build out a project, I spend more time fiddling with the software itself – dealing with, say, breakages in the way side-utilities such as Compressor work – than I do editing the video. I end up producing sub-standard or incorrectly compressed files and hoping YouTube fixes things on their end, problems I wouldn't have if I used something "more modern" – whether that's iMovie, Final Cut Pro X, or some other tool.

It's to the point where, even if I had something to say or something to record, and I thought I had done a good job with the recording, I dread the post-production enough that I never do it. That's partly a separate issue – video has always been one of the more complicated formats to work with, with so many different parts that ultimately matter to a good production – and partly a matter of making the parts you can control easy, so the overall process isn't too overwhelming.

June 26
Data Storage Dilemma

With the most interesting major tech announcements out of the way for a while, and with a new laptop easing my mind on the question of "how will I write when the Surface 3 dies?", I've had time to think about other things. Not that I have, of course.

Instead, I watched YouTube, and in the back of my mind, I was thinking about the thing that deep down, we all know I really want to do: Save every YouTube video to local storage so I can fall asleep to the sweet, dulcet tones of people recording their progress in Cities: Skylines. And then watch those files again later when I want to see what happened.
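(As an aside, the tooling for this mostly exists already. A minimal sketch, assuming the youtube-dl Python package is installed and pointed at a playlist – the paths and URL here are made-up placeholders, and I haven't actually built this archive – would look something like this:)

    # Minimal sketch, assuming the youtube-dl package is installed
    # (pip install youtube_dl). Downloads a playlist into a local archive
    # folder and keeps a record of finished videos so it can be re-run.
    from youtube_dl import YoutubeDL

    options = {
        # Hypothetical archive location and naming scheme.
        "outtmpl": "D:/video-archive/%(uploader)s/%(title)s-%(id)s.%(ext)s",
        # Remembers already-downloaded video IDs between runs.
        "download_archive": "D:/video-archive/downloaded.txt",
        # Keep going if a single video in the playlist fails.
        "ignoreerrors": True,
    }

    with YoutubeDL(options) as ydl:
        # Hypothetical playlist URL; a channel URL works the same way.
        ydl.download(["https://www.youtube.com/playlist?list=EXAMPLE"])

Run on a schedule, the download_archive file keeps it from re-fetching things it already has, which is about as much automation as this daydream needs.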

Video, both legitimately downloaded into my iTunes library and from things like podcasts, isn't the only thing that eats disk space on my systems. About ten years ago, when data was much smaller, my solution to this problem was to burn a new disc every month or so with data I wanted to keep but didn't need on my disk any more. It worked out well because, with the slow Internet connection I had and the relatively slow rate at which I created or otherwise acquired data, there wasn't an awful lot of it – one or two DVDs (or, if I was feeling spendy, one dual-layer DVD) was enough.

Today, writeable DVDs and Blu-ray discs still exist, but preparing them is as inconvenient as it has ever been, and these discs – costly in their write-once form and even more costly if you try to reuse them for backups – have been massively outpaced by the falling costs and increasing capacities of external hard disks and USB flash drives. Optical media is often unreliable long term: most of the CDs and DVDs I burned in the early 2000s have degraded to the point that it's questionable whether I'll get the data off them. Any other form of relatively capacious removable storage is very expensive and very enterprise-focused. The next best thing to LTO, DAT320, was more affordable, although less robust, and it too has now been discontinued.

It strikes me that it would be great to have a modern removable data storage format that's more robust than hard disks, bigger than flash drives and Blu-ray discs, and ideally fast.

The problem, of course, is that there are always compromises. You can't, say, build a storage format that's capacious, fast, and cheap all at once – if you could, we'd all have LTO tapes at home. I think the trade-offs are going to be in capacity and in speed: it won't be as fast as a real external hard disk or as big as an LTO tape. In trade for being pretty cheap and being treated like removable media, I imagine it would either be a new form of optical or magneto-optical media, or some kind of flexible magnetic storage in the style of Zip drives, or perhaps Bernoulli. Honestly, I wouldn't even mind if it were massively cost-reduced DAT/DDS media with at least double or quadruple the capacity. (Ideally, the native storage capacity would be 500 gigs or so.)

I think that most people don't need an awful lot of that. In fact, most home computer backups aren't any bigger than single disks you can buy today – 8TB for external disks and 10TB for internals – to say nothing of the capacity of things like Drobos and home-focused NAS devices.

But the thing I want to do is create multiple media sets for the backup of a big server system, sets I can then take away for safe keeping. The other thing I want to do is store data in a semi-archival state. External hard disks are big enough and easy to duplicate, but they're also easy to kill, and a certain amount of inactivity will cause them to die.

The other problem with external hard disks for archival use is that they cost a lot up front, they're often much larger than the amount of data I want to "archive" at any given moment, and I don't necessarily want to pull my disks out to add data and then duplicate it later.

I think the common thing to do these days is to put data that's being hoarded into nominally "unlimited" cloud storage locations. I suspect that if it were easier to use something better suited to the task, people wouldn't abuse those tools. The key is making the device fast enough – faster than uploading files over an average Internet connection, though it could be slower than a proper hard disk – and making it inexpensive enough that you can load up on cartridges. It would also be better if, as part of an "archiving" solution, there were a way to catalog the contents of the media, although if they're not tapes you should also be able to just browse the devices.
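To make the cataloging idea a bit more concrete: the software side could be as simple as walking a mounted cartridge and writing an index that stays on the main system. This is purely a hypothetical sketch – the drive letter, volume label, and the cartridge itself are all invented – but it shows the shape of the thing:

    # Hypothetical catalog sketch: walk a mounted cartridge and record each
    # file's location, size, and modification time in a CSV index kept on
    # the main system, so the contents can be searched without the media.
    import csv
    import os
    from datetime import datetime

    def catalog_volume(mount_point, label, index_path):
        with open(index_path, "a", newline="", encoding="utf-8") as f:
            writer = csv.writer(f)
            for root, _dirs, files in os.walk(mount_point):
                for name in files:
                    full = os.path.join(root, name)
                    info = os.stat(full)
                    writer.writerow([
                        label,                               # which cartridge
                        os.path.relpath(full, mount_point),  # path on it
                        info.st_size,                        # size in bytes
                        datetime.fromtimestamp(info.st_mtime).isoformat(),
                    ])

    # Example: index a cartridge mounted at E:\ under the label ARCHIVE-0001.
    catalog_volume("E:\\", "ARCHIVE-0001", "C:\\archive\\catalog.csv")

Searching the resulting CSV is just a matter of grepping it or opening it in a spreadsheet, which is about the level of friction I'd want from a low-end archiving system.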

I think that the mechanism should cost at most a few hundred dollars – no more than $500 if possible – and the media should be reasonably priced. If, in trading away the convenience of flash drives for the lower cost of these cartridges, you could get the media down to around $20 a pop, it would start to make a lot of sense for low-end archiving and backup applications.

In an ideal configuration, the mechanisms cost a little less and the media is perhaps a little slower, but holds a lot more in trade for the speed. I think the "data hoarding" crowd would be fine with something cheap that worked slowly. A configuration where multiple drives could run in tandem, or where there was a cheap loader or stacker, would of course be beneficial, but things like that add complexity and cost.

The trouble is that there are a lot of ifs here. At the top end, for people who are building large multi-terabyte disk arrays to alleviate storage problems, you can almost certainly just get a tape drive and eat the cost. At the low end, RDX costs a lot relative to hard disks, but it's a good, durable backup option for systems with less than 4TB of storage, and it's a removable system that works with spanned archives. At the smallest end, cloud storage with a quota of a terabyte or so is often better as a primary or only storage solution, but mistrust of cloud technology often means that some people end up with their data stored locally (not a bad thing) with no or insufficient backups.

This technology isn't really marketable, and it may not even be possible to build: it conveniently combines the best aspects of LTO and RDX while being cheaper than either of them. I think there is "a market" for this kind of thing, but I don't believe it's terribly big. In truth, I'm sure it's quite small.

Part of the problem is the people who do the data hoarding thing as a hobby: they often either have the wherewithal to run regular tapes, or are totally opposed to the idea and might not be interested in such a device.

June 19
Surface Laptop Impressions

I finally got the Surface Laptop in, a day after I would have liked to, but I'm not complaining too much, since it seems like the shipping company was ultimately able to accommodate my schedule on the second delivery attempt.

This is totally beautiful hardware. It meets the high standards I've come to put on Surface products, and it's, in general, a great upgrade from the computers I've been using.

As a quick review, I have a Surface 3 in less than stellar condition after I dropped it a few times, a Surface RT that sometimes fills in, and a ThinkPad E520 which I use at home. The idea behind the Surface Laptop was that it should replace the Surface 3 as an on-the-go computer and the E520 as a somewhat powerful computer at home or for longer trips.

I haven't had a chance to let the Surface Laptop really stretch its legs yet, but with what I've done so far, it has been good. The battery life isn't as long as I would like in an ideal world, I'm still a little burned by Panos Panay's comments about USB Type C, and it's not the most powerful computer you can buy – but it doesn't have to be, because it reaches a good compromise of size, build quality, and performance.

I upgraded mine to Windows 10 Professional instantly – not because I don't think I could use 10S, but because I know up front that Pro is better for my needs. I'm not a developer, so I don't need to experience this computer as a flagship development test machine, which is the speculated reason for shipping it with 10S.

The display is great. I can use it reliably at about 125% scaling, which works out to around 1800x1200 of work space (the native resolution of 2256x1504 divided by the 1.25 scale factor). The wireless networking is also much better than on the other systems I've been using.

I got the i5/8GB/256GB configuration in a gorgeous blue finish for $1299. I knew up front that I wasn't going to be fine with the 4/128 config. It would work, but I wasn't going to spend $999 on a new computer with the same configuration as not only the Surface 3 but also the ThinkPad T400. It's a configuration that works well today, but I know the web and web-based applications are only getting heavier.

As of this writing, the software setup is simple. I installed Office 365 and PuTTYTray. I have yet to set up anything else, but those two things cover most of what I do. I initially installed Office 365 via the Store, hoping to add the promotional Office 365 time to my existing subscription, but elected to uninstall that and install it normally from Office 365, mainly because the Store version didn't include the desktop version of OneNote.

This brings me to one of my complaints about Windows 10 in general. In addition to receiving a lot of software I can't or won't use – which counts as (perhaps) paid shovelware – you can't install certain things that feel like they should be optional installs, such as OneNote. I like OneNote a lot, but on any system for which I've licensed Office 365 or a desktop version of Office, I want it to be the full desktop version, with easy handling of the notebooks stored on my personal server.

I've already been asked once or twice about my thoughts on Surface Laptop repairability. The obvious source has already published their teardown, and the gadget sites have talked about how the Surface Laptop can't be disassembled without destroying it. On top of that, the RAM (obviously) and now the storage (a change from the previous Surface Pros) are soldered on. The problem with this assessment is that the Surface Pro storage, while technically socketed, was not in any way reasonable to access. Nobody was pulling apart their Surface Pros to replace the built-in storage with bigger or faster m.2 blades, so the practical effect was that it wasn't upgradeable.

The way I'll put it is this: if I find out that 8/256 isn't sufficient, I don't think an upgrade to 16/512 would solve whatever the problem is. That kind of discovery is almost certainly going to be accompanied by a need for, say, a quad-core CPU, a beefy discrete graphics processor, or significantly improved I/O. Even the top-end Surface Laptop, which retails for $2199, is still a Surface Laptop, with all the limitations implied by a 15-watt dual-core CPU, 16 gigs of RAM, and a single USB port. If my Surface Laptop is going to be insufficient for something, it's going to be so insufficient that no configuration would have done well.

And, the things I'm doing on this machine are simple enough that I'm not worrying about that happening for a few years. This machine is pretty explicitly for writing and communication. I would likely have ordered the i7/16/512 configuration if I thought I'd need to use Slack, though.

Panos Panay says that the Surface Laptop was designed to be a viable student computer for four years. I believe him, and I'm hoping I can perhaps squeeze a fifth year out of it. I don't know if I can reasonably expect an awful lot more than that, but it really depends on physical longevity in this case, I think.

My ThinkPad T400 was so over-bought that to this day it's fast enough (with its dual-core, 4/128 config) for my research, writing, and communications. The reason I'm not using the T400 is that its display broke, and buying new batteries for it is expensive and impractical. My Surface RT still runs well and gets good battery life, but it was never particularly fast, and Internet Explorer 11 on it is nearly unusably slow for blog research. The Surface 3 technically outperforms the T400, but only nominally, and only for tasks that are well threaded. The Surface 3 also has the problems caused by being dropped – its touchscreen doesn't work, only part of the display accepts pen input, and it gets slow and unstable when running on battery below 30% of its capacity.

The Surface Laptop wasn't over-bought, but it wasn't under-bought either. The fact that every Intel computer I have that's less than about ten years old still works perfectly fine says a lot about where the Surface Laptop sits. The computing plateau means that unless something game-changing comes up, or web sites grow too much, this machine will be "a useful computer" long after it stops being "a portable computer."

That said, here's to the next several years of portable computing.

June 12
WWDC 2017 Happened!

WWDC 2017 came and went. A lot of interesting stuff. As a heads up, I didn't pay an awful lot of attention to anything about the Apple Watch or the iPhone. I also haven't watched the video yet.

That said, it was an interesting year!

It transpires that most of what Viticci's wish-list video contained turned out to be true. iOS has gained drag-and-drop, a file management app, and the highest end of iPads can now run three applications at once: two in a side-by-side mode and a third floating on top of the other two. (This is essentially primordial windowing.)

There was new iPad hardware, which looks relatively exciting, including the hotly rumored 10.5-inch iPad Pro. There was also a new revision of the 12.9-inch iPad Pro.

On the Mac OS X side of things, OS X High Sierra ("Snow Sierra") 10.13 brings APFS to SSD-equipped Macs (nothing for hard disks quite yet) and is being talked about as another quality-of-life release like Snow Leopard. I think what a lot of people realize is that the other [Modifier] [Cat] release, Mountain Lion 10.8, was a relatively problematic one. It's pretty widely agreed that 10.8 is the low point in Mac OS X history up to this point. (I personally think the low point is probably 10.7, with 10.6 and 10.8 each being only slightly better and 10.5 and 10.9 each that much better, but that's sort of splitting hairs.)

The Mac hardware was almost all refreshed. Literally the only things left untouched were already the oldest things in the product stack – the Mac Pro and the Mac mini. Even the 2015 MacBook Air had a configuration update (a slightly faster CPU; 16GB RAM was removed).

The real stars of the show, for me, were the acknowledgement of external GPUs via Thunderbolt 3 and the iMac Pro. The new iMacs are a great update, and most of the 21.5-inch models at $1299 and above have gained socketed CPUs and memory, but there isn't any information yet on the new $1099 model, which has moved to a slightly better CPU.

The iMac Pro is an interesting machine. Apple has managed to cram an awful lot of CPU and GPU firepower into it, and it'll be really interesting to see whether it can perform and stay cool.

Updates to the Mac mini and the Mac Pro have yet to happen. I'm thinking we might see the mini silently updated later in the year; the Mac Pro is guaranteed to happen as soon as "next year," but it could be even longer.

June 05
Thoughts Ahead of WWDC

This year's WWDC is highly anticipated. Based on Things I've Seen™ and some of my own wild guessing, I'm going to make a few predictions for what we'll see this coming week:

  • New iPhone
  • New release of iOS
  • New iPad model(s)
  • Refreshed Mac laptops

In general, everybody will be very disappointed for the following reasons:

  • The new iPhone will be 50% faster than the old one and have better everything, except it will not switch to USB Type C cabling and will still have no headphone port.
  • The new release of iOS will feel uninspired and will introduce bugs that make the iPhone completely unusable for a small-looking portion of the community – one that will feel extremely huge once they start complaining. It will not sufficiently transform the iPad Pro into a laptop replacement.
  • The new iPad will be 50% faster than its predecessor but the new release of iOS won't make that horsepower meaningfully usable for applications that can use that type of horsepower. A low memory ceiling will make web browsing on it painful compared to 2018's new iPad models, despite it being good enough in every other way.
  • The new Mac laptops will have Kaby Lake processors but will otherwise be identical to the old models. They might not even update the chipset to a new model. Prices will not meaningfully budge. Apple will discontinue the 2016 lineup but keep the 2015 lineup on sale. There's a non-zero chance the MD101LL/A will be re-introduced.

We'll see how my predictions do. I think that there will be a loud contingent who congratulates Apple on whatever they do. Notably absent will be any mention of the replacement for the Mac Pro, any update for the now-ancient Mac mini, and the supposed pro-focused iMac.

There's a lot of excitement about an iOS concept video that was shown off on one of the Apple blogs. The idea was to make the iPad more productive by introducing functions such as system-wide drag and drop, a finder, and a few other nice things. It would be nice, but I question whether Apple will be able to save the iPad from its sales decline. People who want to do "productivity" are now trending toward Macs again and people who want to do consumption on the go are fine with a big iPhone.

I still see the iPhone and the iPad as different products, but I think most of the people who can get away with using an iPad as their main computer do not strictly need or want the horsepower and cost of something like the iPad Pro. The people who want to use an iPad Pro to prove a point need iOS to be a little more capable, and people who need that functionality today are just buying Macs and Surfaces.

There are lots of rumors about a Siri speaker, and I think it'll be interesting. I don't use Siri an awful lot, so I don't really know what using it will be like. It's likely to be a lot better for privacy than anything from Google or Amazon, and I bet there will be people who buy them as AirPlay speakers. I would consider it, especially if Apple makes it compatible with Windows and Macs and with other audio sources.

One item I personally hope for is an updated iPhone SE. The platform that phone is based on is now a few generations old and we're looking forward to either an iPhone 7S or iPhone 8 coming out this year. I believe an updated SE would do well with what appears to be a pretty dedicated fan base for that form factor.

I think the particular era we're in with Apple is one where they've just discovered that maybe they don't always get it right, even now, and that at least one – probably more – of their products doesn't meet their customers' needs in a big way. The other trouble with the Mac line is that the majority of the models on sale are over two years old. I've seen more than one post in "non-techie" corners of the Internet from people who bought a MacBook Air or MacBook Pro or Mac mini and were surprised to find out that it was from 2013, 2014, or 2015.

It's possible, albeit unlikely, that we will see a new MacBook Pro family that reflects this reality, but I think that what will really happen is that Apple, having recognized this issue in late 2016 and discussed it in 2017, will announce a few products that address the problems in 2018. New models will help a lot, and I think Apple needs to take a long hard look at their product family and address what might cause problems for customers. I don't think Apple is selling bad computers, but I do think it's possible that some of the models are needlessly compromised in ways that make the experience or configuration worse for people who buy them.

May 29
Surface Pro (5) Miscellany

I posted last week about the Surface Laptop and the current calculus for the Microsoft Surface family, mainly because I wanted to be correct for at least one day, should something have come up.

The Surface Pro was refreshed on Tuesday. Perhaps the most important thing to say up front is that I don't think it's life changing from a product perspective. The same configurations are available, and the new processor was never going to be a particularly big jump in performance. The physical size of the device is the same, it fits in the same docking stations and uses the same power adapters and has the same other ports as well.

Perhaps the big noteworthy change is that Microsoft now rates the Surface Pro for around 13.5 hours of battery life, which is in line with the rating on the Surface Laptop. If either of these estimates is as accurate as previous estimates have been – which is to say, I'm expecting real usage to get close to 75% of the stated life, or about ten hours – then it's a huge thing.

Battery life is a funny thing. I don't think most people are explicitly interested in, say, a computer that'll run for twenty hours uninterrupted. However, I think most people would view that as a benefit, because it means they can either expect to be able to use it through a work day or on a long flight with no trouble. Another scenario I've seen (and used myself) is that a computer that lasts a long time on battery when you're hypermiling it will also last a fair while when you explicitly do not save energy. In particular, I'm thinking about situations where somebody might do some light gaming on a computer running on battery, or do traditionally "heavy" work such as graphic design or development, perhaps even virtual machines.

In that way, it's sort of interesting to side-step the Surface Pro and go directly to thinking about the possibilities for a new Surface Book with Performance Base, updated to the Kaby Lake architecture. The Surface Book with Performance Base is already rated for about 13.5 hours, so if the improvement from switching to Kaby Lake alone is that much, then it's possible we'll see some huge boosts on something with that much battery.

I like the idea of the Surface Pro a lot, but nothing here is going to make me go get one. For the past several years I've always had low-end Surfaces, so I don't feel like I'm missing out, and I feel like the advantages of the Surface Laptop's form factor will outweigh the flexibility I lose by giving up the Surface Pro form factor.

This is all especially true and relevant for me, as I know now after having used the Surface RT and the Surface 3 (with its broken screen that doesn't accept pen or touch input) that I don't use them as tablets too often, so the laptop form factor of the Surface Laptop isn't a detriment to me.

One of the things apparently mentioned at the event, although I haven't watched the video myself, was a USB Type C adapter. There aren't images of it yet, but we know it's going to connect to the SurfaceConnect port. What we don't know is why. Panos Panay and Microsoft in general pretty clearly don't think USB Type C is ready. It's to the point where, when talking about the adapter, Panos framed it as being for people who like dongles – a reference to the fact that, to work with existing devices that have built-in cables, computers with USB Type C ports (and nothing else) need adapters. The adapters you can generally get aren't too offensive; Google, Apple, and numerous third parties all sell reasonable ones.

To me, this just looks bad, both because Microsoft is willfully digging in its heels on adopting this modern connector and because they're framing it as an issue of needing adapters. However, the Surface family of portable computers is already in a bad place on that front, because Mini DisplayPort is by its very nature a port of adapters. The difference, and the trade-off Microsoft appears to be making, is that mostly everybody already has Mini DisplayPort adapters and cables. They've been on Macs long enough, and Macs are common enough in education and corporate environments, that such an adapter is now a common sight on those machines.

From my perspective, the Surfaces need adapters to do many of the same things anyway, and I think Microsoft over-estimates the importance of handing people a USB flash drive – in education particularly, but also in any other environment I've seen. When people do use USB ports, there's often a need for more than one, which would make a Type C port in addition to what Surfaces already have – or Type C ports instead of Type A and Mini DisplayPort – welcome. More generally, I also think it's bad messaging to disparage your customers who "love dongles" so much. Microsoft is building good hardware, but sometimes there are just weird little things you wish they would do differently.

The thing I fear is that two, perhaps four years from now, when people are using notebooks Microsoft loudly claimed would be good for four years, the USB Type C tides will have shifted a lot more, and we will feel like our platform-leading systems should have come with ports that were already starting to appear on Dells and HPs of all descriptions. You can already buy Type C chargers for phones and computers at retail, and there are already peripherals like external hard disks using USB Type C.

This complaint isn't new, either. The Surface Studio, when new, drew lots of criticism for (among other things) not including USB Type C or Thunderbolt 3, on a system where it would arguably have been very easy to do. I've said this several times, but my other concern remains the future availability of power adapters for these systems, and the fact that SurfaceConnect power supplies are apparently a confusing thing. Surface Pro 3 power supplies floating out in the wild might not charge systems with higher demands, and on the other end of things, the high-watt Surface Book power adapter will only charge one model of the Surface Book. It (supposedly) won't even charge other Surface Books; you must know and remember which one you have.

I'll probably still buy one. I'll like it a lot, and when the issue of Type C really forces itself, I'll be unhappy, and then I'll buy the attendant adapters or a new machine. The port issue and Microsoft's bad messaging about it are really my only anxieties about Surfaces specifically. I have other anxieties, but those are mainly related to the direction of computing overall. In particular, I worry about what it will be like to use a machine with limited and fixed resources if software continues moving in the direction programs like Slack and Spotify have established. Slack and Spotify run well if you have enough horsepower, but they need that horsepower, even though they perform trivial functions that computers have done well (even at the same time) for over twenty years.

Those apps, and heavy web browsing loads, do better relative to the amount of memory and processor horsepower a system has. Today, I do fine on relatively low end hardware, such as the Surface 3 and some old laptops, but performance today doesn't necessarily mean performance tomorrow, especially in a situation where both my habits and the environment can change.

The fifth generation Surface Pro has meaningful change, but the visible changes feel minor. Even claims related to particular types of performance (I'm thinking about battery life in particular) can be meaningful, but only if the old model didn't meet your needs. The display is supposedly better, the keyboards have more Alcantara, the edges are rounded a little bit, and oh, by the way, the interior of the machine has been completely redesigned.

With the precedent set by the Surface Laptop and the Surface Pro (5), I'm excited for the Surface Book ("2"), but I'm not going to wait for it. I think Microsoft is going to introduce it with Kaby Lake, perhaps even if the next Lake is available by the time it's announced, and I think the thing we're waiting for right now is an excuse to have an event and perhaps some new graphics silicon from nVidia, which recently announced a new "low end" discrete graphics chip for laptops that might or might not be right for a second-generation Surface Book. The Surface Studio will be the next thing after that to look for. I expect Kaby Lake, Pascal graphics (in keeping with tradition: perhaps after Volta is available), and continued resistance to integrating USB Type C connectors, USB 3.1 ports, or a Thunderbolt 3 controller with the attendant Type C port.

In all, unless the Surface Book or Studio is massively different, I don't think that the calculus has changed all that much.
