Matt Godden

human : artist


Category : geekery

Tech-specific articles.

#FixedItForYou

If you’re a user of Apple’s macOS, and you’re still on macOS 10.13 High Sierra, 10.12 Sierra, or earlier, you might have noticed that iCloud stopped working around April 7th, depending on your time zone.

The Problem:

The symptoms, apart from sync & iCloud Drive not working – for the system, and for apps that use iCloud – are that you can’t access the iCloud.com website in Safari, while it works fine in Firefox.

Looking into Safari’s Web Inspector reveals a stream of security certificate errors.

Going into the iCloud preference pane in System Preferences (which looks like it’s logged in and everything is fine) and attempting to access your Account Details brings up an error connecting to iCloud.

If you then decide to log out of iCloud, which is about the only troubleshooting technique Apple offers, and you decide to remove iCloud data from your Mac so as to completely clean it out, you will find yourself unable to log back in.

This leaves you without any contacts, calendars, or Safari passwords, and probably breaks the ability to use AirDrop, Handoff etc.

So what’s going on?

From the Safari Web Inspector errors, it looks to me like Apple has broken, or made incompatible, something in the security certificate used by the iCloud server infrastructure – probably in the process of fixing an iCloud outage that had been going on in the days beforehand. Since these versions of macOS aren’t “supported”, one assumes this happened because they weren’t tested.

However, this issue does seem reminiscent of an issue from 2020, when Safari on High Sierra lost the ability to access all of Apple’s web services that ran through idmsa.apple.com (which includes Apple’s discussion forums, iTunes Connect etc). So after a bit of searching, I found the solution as was posted then, and tried it out.

The Solution:

If you go to Apple’s discussion forums, here:

https://discussions.apple.com/thread/251211674?page=3

You’ll see the solution – which involves downloading a new security certificate from Apple, and installing that in your Login keychain.
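
If you’d rather do the install from the command line, the same thing can in principle be done with macOS’s built-in security tool. A minimal sketch, assuming you’ve already downloaded the certificate – the filename here is illustrative, not the actual one from the thread:

  # Import a downloaded certificate into the Login keychain
  # (the .cer filename is hypothetical – use the file from the thread)
  security import ~/Downloads/replacement.cer -k ~/Library/Keychains/login.keychain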

That fixes the problem.

Instantly.

No rebooting, no nothing. It’s fixed so quickly that if the next thing you do is switch to Safari and hit Reload on iCloud.com, or switch to the iCloud prefpane and hit Account Details, it works immediately.

So, there you are, trillion dollar company, a big problem for a fair chunk of your userbase, just fixed for you, free of charge.

This certificate expires in May. I don’t know what will happen then – whether Apple will have fixed things in the meantime, whether you’ll just need to keep replacing these certificates periodically, or whether there’s a different certificate you can use that’ll be more permanent. If I find out, I’ll update this.

EDIT May 21: The certificate expired at 1:45am Australian Eastern Time, and everything broke again, aside from getting Account Details in System Preferences.

Until Apple issues an updated certificate, a temporary workaround is to open Keychain Access and select the Login keychain. Choose View Menu > Show Expired Certificates, right-click on the CA 2 – G1 certificate, go to the Trust section, and set “When Using Certificate” to: Always Trust.

That will fix it instantly.

Edit May 22: It’s broken again, and nothing appears to fix it.

Edit May 23: In Keychain Access, System keychain, changing the trust settings for GeoTrust Global CA to “Always Trust” fixes the problem instantly.
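
For the keyboard-inclined, the same trust change can in principle be made with the security tool. A sketch, assuming the certificate name matches as written – verify the exact name in your keychain first:

  # Export the root certificate, then mark it trusted in the System keychain
  security find-certificate -c "GeoTrust Global CA" -p /System/Library/Keychains/SystemRootCertificates.keychain > /tmp/geotrust.pem
  sudo security add-trusted-cert -d -r trustRoot -k /Library/Keychains/System.keychain /tmp/geotrust.pem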

Edit May 25: Apple PKI issued a new certificate which solves all the problems, and allows you to reverse the Always Trust changes for the expired certificates.

If this helped you, maybe go buy one of my eBooks.


Adventures in Image Processing

So here’s an interesting mystery / conundrum / process I recently went through in trying to create a new workflow.

For reasons of not wanting to subscribe to software, I’m still using the CS5 versions of Adobe Apps for Surfing The Deathline. The original documents are heavily constructed in Photoshop, then all the text, sound effects etc are done in InDesign, which makes them rather non-portable to other solutions.

It had been so long since I’d done a serious update of the books that I’d forgotten parts of my workflow, and so had started some things from scratch.

Surfing The Deathline uses .png format images for its pages. Although they take up a lot more space than .jpg versions, they have the advantage of being colour-accurate. A major problem with .jpg is that for images in black & white, a single pixel of colour will shift the white and black values away from their correct tones. So, where you get two pages butting up against each other at the spine, the greys might not match.

InDesign CS5 has no direct png output option, so the workflow is:

  1. PageExporterUtility script to output the pages as individual .pdf files
  2. Convert the .pdfs to .pngs
  3. rename the .pngs and move them to the appropriate EPUB document’s image directory.

I had created an Automator action, which took in the .pdf files, and converted them to .png, and saved them to disk. It took about 3-5 minutes to do all 236 pages.

But there was a problem with the output…

Certain pages seemed to have red & blue fringing on their text. Going through the .pdf files, it became apparent that this was linked to the pages which had a specific masterpage controlling their appearance. Looking at that masterpage, the thing that suggested itself as problematic was the page number – it was frontmost in the layering stack. So, I deleted and recreated the page number object in the masterpage, applied it to the pages, and reran the .pdf to .png workflow.

Problem solved. Almost.

My large sound effects were still showing red/blue colour fringing. After a couple of days’ research, it became apparent that this was caused by the system applying sub-pixel antialiasing to the .pdf file during the render.

After experimenting with commandline options for disabling it, I found out there was actually a checkbox for it in the System Preferences app. Unfortunately, switching it off makes the system’s display look worse, so what I needed was a way to toggle it off, run the image processing steps, then switch it back on.
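
For reference, that checkbox appears to map to a per-host default you can flip from Terminal. A sketch, assuming the standard AppleFontSmoothing key is what sits behind it – verify on your own system before relying on it:

  # Disable font smoothing / subpixel antialiasing for the current host
  defaults -currentHost write -g AppleFontSmoothing -int 0
  # Remove the override to restore the system default afterwards
  defaults -currentHost delete -g AppleFontSmoothing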

After some experimenting, and asking on forums, I was able to get an Applescript that did the job, and add it to my Automator action:

What this does is:

  1. run an Applescript to open System Preferences, test if the checkbox is ticked, untick it if it is, close System Preferences, then,
  2. run all the pdf files through the Render PDF Pages as Images function to create png files, then,
  3. move the converted versions to a new location. Then,
  4. open System Preferences, test the antialiasing setting, and switch it back on, then close System Preferences.

It was fantastic – I had a wonderful system where, once I’d output the .pdfs from InDesign, I could select all of them and render them all to .png in a single right-click.

But there was a problem…

Images which crossed the spine of the EPUB book weren’t aligning correctly. Clearly, something was wrong with the way the Automator renderer was converting .pdf into .png. It didn’t matter what scale I rendered it at – even at the full native 300dpi, the problem remained.

When I compared it against doing the same process manually in Photoshop, it also became apparent that the math behind Automator’s conversion was out – files from Automator were always cropped 1-2 pixels smaller than they were from Photoshop.

Then I started researching whether there was an alternative commandline image processor in macOS – something I could call from an Applescript, to replace Render PDF Pages as Images. Thankfully, there was – SIPS, the Scriptable Image Processing System.

After a bunch more research, I managed to sort out the appropriate commands, and gave SIPS a go on my pdf files. The results were the same. I tried it manually with Preview, the results were the same, again. It appears SIPS is the core image processor all these built-in macOS tools use, and it’s SIPS that has the bad math function for rendering PDF files as images.
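
For anyone wanting to reproduce this, the invocation involved is along these lines. A sketch, assuming one page per .pdf, as output by PageExporterUtility:

  # Convert each single-page PDF in the current directory to PNG via SIPS
  for f in *.pdf; do
    sips -s format png "$f" --out "${f%.pdf}.png"
  done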

SIPS also produced pretty garbage image quality compared to Photoshop.

So now I was looking for an alternative to SIPS, and I managed to find one – ImageMagick, a cross-platform commandline image processor. It uses Ghostscript, an opensource PostScript / PDF interpreter, to render the .pdf, so everything about it is separate from the SIPS processes. After a couple of days trying to figure out how to install it (hey, opensource projects, try making your basic documentation an educational resource for people who haven’t used your tools previously), I was able to make it work…
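
Once installed, the conversion itself boils down to a one-liner. A sketch, with -density set to match the 300dpi source documents:

  # Rasterise the PDF at 300dpi via Ghostscript, then write out a PNG
  convert -density 300 page.pdf page.png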

It delivered fantastic results, but took 30 seconds per image to process the .pdf files. In contrast, Photoshop, which was so slow I was looking for an alternative, takes 8 seconds.

You might question why I don’t use Affinity Photo, which can tear through the entire 236 pages in around 8 seconds total (gotta love that multithreaded action). Well, unfortunately, Affinity Photo’s pdf renderer can’t handle the edge effects of my InDesign speech bubbles.

So I’m back to where I began, using ancient versions of Photoshop and InDesign, and needing to take a 35 minute break so Photoshop’s Image Processor script can do its thing, every time I want to run a set of updates from InDesign to EPUB.

Update 23 April 2021:

In experiments with image sizes for Photoshop’s scaling when it renders the PDF file to TIFF, I’ve hit upon a target size that seems to be in some sort of mathematical sweet spot for Photoshop, because the processing time has gone down to about 1 second per image, from ~8 seconds.


Hard Reality.

In April 2016, HTC released the Vive VR headset. Designed in conjunction with games developer Valve, the Vive represented a significant evolution in consumer Virtual Reality.

Technologically, the Vive’s breakthrough centred on a tracking system that could detect, within a 3×3×3m volume, the position and orientation of the headset, the controllers, and any other object that had a tracking puck attached to it. Crucially, this volumetric tracking ability was included as a default part of the basic kit.

The result is that HTC’s hardware has effectively defined the minimum viable product for VR as “room scale” – an experience which lets you get out of the chair, and walk around within a defined area. Not only can you look out in all directions, you can physically circumnavigate a virtual object, as if it were a physical object sharing the room. When combined with Valve’s SteamVR platform and store, this has created an entire turnkey hardware and software ecosystem.

From my recent experience of them, the Vive plus Steam is a product, not a tech experiment. This is a tool, not a toy.


First, some basic terminology for the purposes of this article:

  • XR: Extended / Extensible Reality – A blanket term covering all “reality” versions.
  • VR: Virtual Reality – XR in which the real world is completely blocked out, and the user is immersed in a completely computer generated environment.
  • AR: Augmented Reality – XR in which the real world remains visible, directly or via camera feed, and computer generated elements are added, also known as “mediated reality”.
  • GPU: Graphics Processing Unit – the part of a computer that does the work to generate the immersive environment.
  • eGPU: A GPU in an external case, usually connected via Thunderbolt.

More than a year after the Vive’s release, Apple used their 2017 World Wide Developers Conference to announce they were bringing VR to macOS, in a developer preview form.

For those of us in the creative fields who are primarily Mac-based, and have wondered “when can I get this on my Mac?“, Apple’s announcement would seem to be good news. However, there are fundamental differences between Apple’s product philosophy for the Mac, and the needs of VR users and developers. This raises serious concerns as to the basic compatibility of Apple’s product and business model, with the rapidly evolving first decade of this new platform.

Hardware:

When it comes to Apple and VR, the screaming, clownsuit-wearing elephant in the room is this: Apple has weak graphics.

This is the overwhelming sentiment of everyone I have encountered with an interest in VR.

The most powerful GPU in Apple’s product range, AMD’s Vega 64 – available starting in the AU$8200 configuration of the iMac Pro – is a lowered-performance (but memory-expanded) version of a card which retails for around AU$800, and which is a fourth-tier product in terms of 3D performance within the wider market.

Note: Adding that card to an iMac Pro adds AU$960 to the price of the machine, which already includes the lower-performance Vega 56. In contrast, the actual price difference between a retail Vega 56 and 64 is around AU$200. Effectively, you’re paying full price for both cards, even though Apple only supplies you with one of them.

The VR on Mac blog recently posted an article lamenting “Will we ever really see VR on the Mac?”, to which you can only respond “No, not in any meaningful sense, as long as Apple continues on its current product philosophy”.

To paraphrase the Clinton campaign: “It’s the GPUs, Stupid”.

When you’re looking at VR performance, what you’re effectively looking at, is the ability of the GPU to drive two high-resolution displays (one for each eye), at a high frame rate, with as many objects rendered at as high a quality as possible. Effectively, you’re looking at gaming performance – unsurprising, given a lot of VR is built on game engines.

Apple’s machines’ (discrete) GPUs are woefully underpowered, and regularly a generation out of date when compared to retail graphics cards for desktop computers, or those available in other brands of laptops.

Most of the presenters at Immerse were using MacBooks for their slide decks, but none of the people I met use Apple gear, or seem to have any interest in using Apple gear to do VR, because, as I heard repeatedly, “the Mac has weak graphics”.

How weak is “weak”?

Looking at the GPUs available on the market – in terms of their ability to generate a complicated 3D environment, and render all the objects within that environment in high quality, at the necessary frame rate – here they are, roughly in order of performance, with a price comparison. This price comparison is important, because it represents not just how much it costs to get into VR if you already have a computer, but how much it costs, roughly on an annual schedule, to stay at the cutting edge of VR.

Note: This excludes Pro GPUs like the Quadro or Radeon Pro, since they are generally lower performance in terms of 3D for gaming engines. The “Pro”-named GPUs in Apple’s products are gaming GPUs, and do not include the error-correcting memory that is the primary distinguisher of “Pro” graphics cards.

  • Nvidia Titan V: ~AU$3700. Although not designed as a gaming card, it generally outperforms any gaming card at gaming tasks.
  • Nvidia Titan XP: AU$1950
  • Nvidia 1080ti: ~AU$1100
  • Nvidia 1080 / AMD Vega 64: ~AU$850 (IF you can get the AMD card in stock)

Realistically, the 1080ti should be considered the entry level for VR. Anything less, and you are not getting an environment of sufficient fidelity that it ceases to be a barrier between yourself, and the work. A 1080 may be a reasonable compromise if you want to do mobile VR in a laptop, but we’re not remotely close to seeing a Vega 64 in a Mac laptop.

So what does this mean?

  • The highest-spec GPU in Apple’s “VR Ready” iMac Pro is a 4th-tier product, and is below the minimum spec any serious content creator should consider for their VR workstation. It’s certainly well below the performance that your potential customers will be able to obtain in a “Gaming PC” that costs a quarter of the price of your “Workstation”.
  • The GPU in the iMac Pro is effectively non-upgradable. The AU$8-20k machine you buy today will fall further behind the leading edge of visual fidelity for VR environments every year. A “Gaming PC” will stay cutting edge for around AU$1200 / year.
  • While Vega 64 is roughly equivalent in performance to Nvidia’s base 1080 (which is significantly lower performance than the 1080ti), in full-fat retail cards, it can require almost double the amount of electricity needed to power the 1080.
  • Apple’s best laptop GPU, the Radeon 560 offers less than half the gaming 3D performance (which again, is effectively VR equivalent) of the mobile 1080, and you can get Windows laptops with dual 1080s in them.
  • Apple is not yet providing support for Nvidia cards in eGPU enclosures, and so far only officially supports a single brand and model of AMD card – the Sapphire Radeon RX580 Pulse, which is not a “VR Capable” GPU by any reasonable definition.

The consequences of this are significant.

We’re not going to see performance gains in GPU hardware, or the performance requirements of VR, plateau any time in the near future. A decade ago, computers became fast enough to do pretty much anything in print production – 300dpi has remained the quality of most print, and paper sizes haven’t changed. That’s not going to happen for VR in the next decade.

GPU progress is not going to hold itself to Apple’s preferred refresh and repurchase cycles for computers. The relationship content producers have with GPUs is, I suspect, going to be similar to the relationship iOS developers have with iPhones & iPads – whatever the top of the range is, they’ll need to have it as soon as it’s released. People aren’t going to turn over a several thousand dollar computer every year, just to get the new GPU.

By Apple’s own admission at WWDC, eGPU is a second-rate option compared to a GPU in a slot on the motherboard. A slotted card on the motherboard has potentially four times the bandwidth of a card in an external enclosure. For a user with an 11-13″ microlight laptop, eGPU is a good option to have VR capability at a desk, but it’s not a good solution for desktop computers, or for portable VR.

While Nvidia’s mobile 1080 has been an option in PC laptops for some time now, and offers performance comparable to its full-fat desktop version, AMD (and by extension Apple) seems to have nothing comparable (a mobile Vega 64) on the horizon for Macbooks.

There are, therefore, some really serious questions that need to be asked about Apple’s priorities in using AMD for graphics hardware. Overall, AMD tends to be marginally better for computational GPUs – in other words, GPUs that are used for non-display purposes. For realtime 3D environments, Nvidia is significantly ahead, and in mobile, represents having the capability to do VR at all.

If the balance of computation vs 3D gaming performance means computation is faster, but VR isn’t possible, then it really starts to feel like back in the days when the iMac went with DVD-ROM while everyone else was building around CD burners.

Software:

Apart from operating system changes relating to driving the actual VR hardware, Apple’s “embrace of VR” was more or less devoid of content on Apple’s part, in terms of tools for users.

Apple’s biggest announcement regarded adding “VR support” to Final Cut Pro X. As far as I can see, this is about 360 video, not VR. This needs to be emphasised – 360 Video is not VR. It shares some superficial similarities, but these are overwhelmed by the fundamental differences:

  • 360 Video is usually not 3D. It’s effectively just video filling your field of vision.
  • 360 Video is a passive medium. While you can look around, you can’t interact with the environment, or move your viewpoint from a fixed location.

In contrast, VR is:

  • a place you go to,
  • a place you move about in, and
  • a place where you do things.

VR is an activity environment, 360 Video is television in which you can only see one third of what is happening, at any one time.

The power of VR is what you can do in it, not what you can see with it.

For example Tvori:

And for a more nuts & bolts vision of actually working in VR:

This is using a 3D VR workspace to create content that will be played on a 2D screen.

This is important – the future of content creation when it comes to VR is NOT going to be based upon using flat screens to create content that can then be viewed on VR goggles. It’s the other way around – we’re going to be using VR toolsets to make content that will be deployed back to 2D platforms.

All of the current development and deployment environments are inherently cross-platform. It’s unlikely that anyone is going to be making macOS-specific VR apps any time in the near future. That’s a self-evident reality – the install base & market for VR-capable Macs is simply too small, and the install base & market for VR-capable PCs too large, to justify not using an application platform that allows for cross-platform applications. VR does not have the problem of a cross-platform app feeling like a second-rate, uncanny-valley facsimile of a native application. In VR, the operating system conveys no “native” UI paradigms – it’s just a launcher, less in fact, given that Steam and Viveport handle launching and management of apps. It’s a glorified BIOS.

This is not going to be a replay of iOS, where Apple’s mobile products were undeniably more powerful, and more capable than the majority of the vastly larger market of Android and Windows Mobile devices, and were therefore able to sustain businesses that could ignore other platforms. VR-capable Macs are smaller in market, less-capable as devices due to weak graphics, higher in price to buy, and radically higher in price to maintain relative performance, than VR-capable PCs. As long as this is the case, the Mac will be begging for scraps at a VR table, where Windows (and eventually Linux & SteamOS) will occupy the seats.

The inherent cross-operating-system metaplatform nature of Steam reflects a growing trend within the Pro software market – formerly Mac-only developers are moving their products to be cross-platform, in effect, making their own technologies the platform, and relegating macOS or Windows to little more than a dumb pipe for commoditised hardware management.

One of the recent darlings of the Apple world, Serif, has taken their Affinity suite of design, imaging and publishing apps across to Windows, as have Macphun, who’ve renamed themselves Skylum, and shifted their photography software cross-platform. In the past, developers had marketed their products, based on the degree to which they had embraced Apple’s in-house technologies as the basis of their apps – how “native” their apps were. These days, more and more are emphasising independence from Apple’s technology stack. The presence of the cross-platform lifeboat is becoming more important to customers of Pro apps, than any advantage brought by being “more native”. The pro creative market, by and large, is uncoupling its financial future from Apple’s product strategy. In effect, it’s betting against that strategy.

What does Apple, a company whose core purpose is in creating tasteful, consistent user interface (however debatable that might be these days), have to offer in a world where user environments are the sole domain of the apps themselves, and the operating system is invisible to the user?

Thought exercise, Apple & Gaming:

Video and cinema have always been considered a core market in which Apple had to invest. Gaming (on macOS) has always been a market that Apple fans have been fine with Apple ignoring. The argument has always been about the economics and relative scale of each. It’s worth bearing in mind, however, that the games market and industry dwarfs the cinema industry.

Why is it ok amongst Apple fans, Apple-centric media, and shareholders, for Apple to devote resources to making tools for moviemakers / watchers rather than directing it at game developers / players?

When Apple cuts a product, or restricts the versatility of a product under the guise of “focus”, there’s no end of people who’ll argue that Apple is focussing on where the profits are. Mac sales are relatively stagnant year over year. Gaming PCs – or, as they’d be called if Apple sold them, “VR Workstations” – have been consistently growing in sales at around 25% year upon year for a while now.

Windows’ gaming focus and games ecosystem, is co-evolutionary with VR. It is the relentless drive to make Windows as good as possible as a gaming platform, that makes it the better VR platform. No amount of optimisation Apple can do with Metal, their 3D infrastructure, can make up for the fact that they’re shipping sub-standard GPUs in their devices.

”High spirits are just no substitute for 800 rounds a minute!”


Apple’s WWDC VR announcements seem to have had very little impact on people who are using, and making with, VR now. No one I spoke to at Immerse seemed particularly excited about the prospect of Apple getting into the space, or seemed to think Apple had anything in particular to offer. If you look at what Apple did to professional photographers by neglecting, and then dumping, their Aperture pro photo management solution without providing a replacement (and no, Photos is not that), that wariness is well-justified.

What Immerse really opened my eyes to, is that VR is very probably a black swan for Apple, who have spent the last 5 years eliminating the very things that are central to powering VR – motherboard PCI slots, the associated retail-upgradable GPU, and the entire culture of 3D performance focus – from their product philosophy.

VR is an iceberg, and Apple, no matter how titanic, will not move it. The question is whether the current design, engineering and marketing leadership, who have produced generation upon generation of computers that sacrifice utility and customer-upgradability in the pursuit of smallness, are culturally capable of accepting that fact.


Hey, If you liked reading this, maybe you’d like to donate?


An Encounter with Vive:

Some terminology for the purposes of this article:

  • XR: Extended/Extensible Reality, or possibly just (x)R – an umbrella term covering all forms of simulated and mediated reality. (note: let’s agree to pronounce the “x” as a “z” like xylophone, so XR sounds like the bad guy from The Last Starfighter)
  • VR: Virtual Reality – a form of XR characterised by blocking out of the “real” world, providing a total immersion in a wholly simulated environment.
  • AR: Augmented Reality – XR in which the real world remains visible (either directly, or via a camera feed), and computer generated elements are added to mediate reality. (and sounds like a pirate noise)
  • GPU: Graphics Processing Unit – the card / components that drive the visuals of the VR experience. Usually a dedicated card in desktop computers, but built into the motherboard on many laptops.
  • eGPU: External GPU – A GPU in an external case, usually connected by Thunderbolt to the main system.

A few weeks ago, here in the sticks of regional Australia, we had a little conference day (immerseconf), with internationally practicing artists from all over the country (including the head of HTC Vive in Australia), demoing how various forms of Extended Reality are being used by artists to create content.

Interestingly, while there was a “serious games” (training & education simulations) discussion, traditional entertainment videogames weren’t covered – this was a conference targeted at makers, and the toolsets available to them for creating. This shouldn’t be taken as indicating the experience was dull – delight and joy are inherent to the experience of doing work in VR.

I’ve been reading about and waiting for this tech since the 1980s. Last time I tried it, a couple of years ago, the head-mounted display (goggles) was an Oculus devkit, and interaction was via a PlayStation controller.

I was ill within a minute.

A theory of why this happens is that it’s a result of lag between moving your head, and seeing the corresponding movement of the virtual world through the goggles. With the Vive, that problem is solved – the viewpoint is stuck fast to your proprioceptive experience of movement. Lag is gone, you are there.

For an artist, the experience of VR marks a division between everything you have done, learned or experienced in art-making prior, and what you are to do afterwards. It is as redefining an experience as postulated in Crosley Bendix’s discussion of the “discovery” of the new primary colour “Squant”.

In my life, I have been literally moved to tears once by a work I saw in an exhibition – Picasso’s “Head of a Woman”. Why? I had studied this work, part of the canon of historically important constructed sculpture, for years at art school. I’d written essays concerning it, and answered slide tests about it. However, every photo I had seen was in black and white. I finally saw it in the flesh at an exhibition, and out of nowhere found myself weeping at the fact that I had never known what colour it was painted. Nothing I had read, or studied, prepared me for the overwhelming emotional impact of meeting it, face to face, and realising that I had not known something as fundamental as its colour.

Of all the great leaps in art making that Picasso was personally involved with, it was his collaboration with Julio González that more or less invented welded steel sculpture. He did this, primarily out of a desire to be able to “take a line for a walk” in three dimensions, to draw with thin metal rod, the only material whose structural strength could span distance without thickness or sagging.

In VR, free-standing, able to walk about with multi-function hand controllers in an entirely simulated, blank environment, I was once again almost in tears at how profound the experience of this tech is for artmaking. One can literally take a line for a walk, twist it, loop it around itself, trace out the topology of knots, zoom out, zoom inside, and see that three dimensional drawing as a physical object, hanging in the air.

The tools I played with were from Google – Blocks, a simple 3d modelling program, and Tilt Brush, a drawing and painting program (which is also a 3d modeller – it just models paint strokes, and so produces flat ribbons of paint, that follow the 3D orientation of the controller when you make them). They’re reasonably primitive compared to traditional 2D painting and modelling apps, but there’s clearly a commercial space for selling tools for VR.


Just watch this. That’s the actual experience of creating and working in Tilt Brush.

Or this:

Why would you want to use a screen-based 3d modeller?


Speculation, based on Observation:

  • The authoring environment for VR content, is VR.
    • After 3D modelling, or drawing in VR, you’ll never want to model or paint on a screen again. The idea of not having a direct 3d experience while creating just becomes nonsensical. As for Tilt Brush, there’s no 2D equivalent – I’m not sure there’s even a way to think about how Tilt Brush would work in 2d.
    • Don’t think about VR as a way to preview things you make on screen – making things in VR is so compelling, you will want to change the way you work, or change the sort of work you do, to get as much as possible into the immersion.
  • 360 Video is probably going to end up being a niche or gimmick, like 3d television.
    • The very clear sense I have after this encounter, is that 360 Video (which I first saw demonstrated 16 years ago at the QTVR Forum at Macworld New York) is an attempt by an old, established artform (and players within that artform), to annex a new format for itself, regardless of whether it is appropriate for that new format. If all you have is a hammer, you treat everything as if it were a nail.
    • Outside of video art – time-lapses of locations, or documentary – 360 video may be a way to make video skyboxes and motion backgrounds, at least until software can make them more effectively than a film crew can shoot a real location – which, if you look at any modern film, it already can.
    • Video’s monopoly on “real” will not survive the growth in quality of simulation, which carries with it true interactivity. Why experience a 360 video version of surfing, when you can have a photoreal simulated surfing experience, in which you do more than control the direction you’re looking, and can have it on that water planet in Christopher Nolan’s Interstellar?
    • If 360 video fundamentally changes its nature, becoming something in which the narrative progression is reactive to the directed attention of the viewer, perhaps there’s a possibility there, but isn’t that just a video game with the skill tests removed?
    • Otherwise, how do you get a jumpscare to work, if the viewer is never looking in the direction of the monster? Interactivity and moving around within a place is VR’s point. 360 video is about being a fixed point. Think of it as similar to the way focus-pulling and depth of field are fundamentally incompatible with 3d cinema – viewers can struggle against the director’s chosen point of focus, trying to see unfocussed objects they physiologically understand they should be able to “grip” with their eyes and pull into focus.
    • There are also issues with the physics of optics, revolving around how panoramic images are captured, that make stereo separation with 360 video fundamentally problematic.
    • Using VR headsets to screen non-interactive, immersive stereoscopic 3D video (in other words, you only see what the single direction paired cameras are pointed at) would certainly seem to have a future, given the pornography industry has adopted it for the Point Of View genre.
  • VR is a platform, not a peripheral.
    • This is huge – Mac, Windows, Linux – all of these are irrelevant, you’re simply not going to interact with the host OS to any meaningful degree. The operating system of the computer merely serves as the loader for the VR environment. You’ll have no more cause to interact with macOS or Windows, than you do to interact with your computer’s firmware. Tilt Brush will look like Tilt Brush, regardless of what operating system it is running on. Look at Adobe’s clear strategy to nullify the operating system as a differentiator, and get their users to think of their computers as “Creative Suite Workstations” rather than “Macs or PCs running Creative Suite”. VR will be even more so.
    • Everything is up for grabs as new paradigms for fundamental control schemes are solidified. Think how revolutionary the first pull-down menu was, that’s the sort of world VR is in. From Blocks and Tilt Brush you can see already, UI paradigms that are perhaps overly-literal. Multi-sided, rotatable physical palettes wrapped around the controllers are in vogue, but why? Why not have the equivalent of a 30″ monitor, offset 45 degrees, full of palettes that appears in response to a button press, then goes away again? Or, why not a literal wheeled toolbox, that follows behind you? The physicality of creating in VR is a very different working experience to sitting at a desk.
  • The GPU is everything.
    • VR computers are just a host system for the GPU (graphics card). A non-upgradable GPU, or a system that can’t be traded up for the market retail cost of a GPU, is a laughable idea, truly laughable. Once you use one of these systems and see how good it is – but more importantly, how much better VR is going to get in the near future in terms of graphical fidelity – and consider the soon-to-arrive retina-scale upgrade to headset display densities, the thought of having to replace a whole computer just to keep cutting edge is just an unthinkably stupid idea.
      • To put that in perspective, no matter what manufacturers claim, Nvidia’s 1080ti is the minimum graphics card you need to create a simple virtual environment of sufficient fidelity that you’d want to spend all day working within. That is the standard you have to show people, so they can think “this is here and I want to use it“.
      • The 1080ti is the second-fastest GPU Nvidia offers in terms of 3D gaming performance, which is the effective measure of how well the immersive environment will perform.
      • The 1080ti is around 30% faster than the fastest performing GPU AMD makes (Vega 64), for a significantly lower power draw and heat output.
      • Numerous developers, including HTC themselves, were demoing on laptops with Nvidia graphics – none of which required eGPUs. HTC’s laptop was subtly lower fidelity than the desktop machines, but not by a lot.
      • AMD graphics cards were nowhere to be seen. Every tower machine (which were bigger than my cheesegrater Classic Mac Pro, and mostly full of empty space) was team green (Nvidia).
  • VR has a huge future in healthcare.
    • Hospitals here are permanently installing Vive trackers in the children’s wards, so bedridden kids can go participate in networked virtual environments with other kids, and not be bored / confronted with the reality of being in hospital.
    • VR is being used for rehabilitation, gamifying physiotherapy rehab exercises for example, to ensure they’re done correctly, and to relieve the monotony of repeat-based therapy.

Food for Thought, AR vs VR:

There is a school of opinion which holds that AR is the “good” version of XR, that VR is a niche for games, and that the goggles etc required for immersion make VR inherently not a thing for the everyperson.

I have a different take on that. I think that AR would seem to be the “good” version of XR, vs full immersion VR, if you’re the sort of person whose socioeconomic status means your life is the sort of life from which you would never want to seek an escape. AR is lovely, if you’re able-bodied, rich, have a nice house, and a job with sufficient seniority that you have your own office and can shut out distraction.

In other words, if you’re employed with any sort of decision-making authority at a large tech company.

If you live in a tiny apartment or room in a sharehouse, or have a disability whose profundity stops you going out to access experiences, or work in a place where you can’t tune out visual distraction, in other words, if your life isn’t already the sort of 1%er fantasy that most people would like to escape to, then perhaps AR isn’t that compelling in comparison.

From that perspective, AR that does not have a “shut out the real world” function isn’t a complete solution – it’s not the whole story.

By the way, as for saying the goggles are inconvenient – go speak to anyone who does any sort of manual trade work. VR goggles are no more inconvenient than having to wear safety glasses, gloves, steel-capped boots, ear muffs, a respirator, or a welding helmet. Just because it’s less convenient than what an office worker is used to doesn’t mean a lot – if I can sketch in 3d before I go out into the welding bay, that’s a huge convenience factor.


So Overall:

My encounter with Vive leaves me with mixed emotions. I am absolutely going to be gearing up for VR. You simply can’t try this tech, and then not move to make art with it. VR is here, and it is now. It is a complete, usable product with both entertainment, and work tools, not an early-access developer preview.

A lot of the coverage I’ve seen of VR, from people who perhaps don’t understand the sheer amount of heavy lifting necessary to drive the experience, centres around ideas like “wait until the PC isn’t required“. That isn’t going to happen – or rather, that’s going to be a sub-standard experience, a better packaging of current smartphone-based VR. The PC to drive VR isn’t going to go away, because the complexity and graphical fidelity of the medium have so much room for growth that enthusiasts will keep asking for more, and content creators will have to keep up in order to feed that cycle.

Local Australian pricing has the Vive setup at about a thousand dollars, and an Nvidia 1080ti at about another thousand – but what to do for a computer to run that rig?

Does Apple have a solution that lets me stay on the Mac, or do I jump to Windows, and begin the inevitable migration of all my Pro software (which is niche enough that it HAS to be cross platform) and production processes across to Windows versions?

Read on in Part 2: Hard Reality


Hey, If you liked reading this, maybe you’d like to donate?


Becoming a Lord of Time…

Disclaimer:

A couple of years after writing this, I found the officially sanctioned way of doing what I’m doing here. It’s based on the Time Machine command-line utilities. I’ve used it since, and it works well.

Here it is: https://www.baligu.com/pondini/TM/B6.html
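
For reference, the sanctioned approach at that link revolves around tmutil. A sketch of the relevant incantation, with illustrative volume and machine names – read the linked page and man tmutil before running anything:

  # Re-associate a replaced volume with the old volume's backup history
  # (-a applies the association across all existing snapshots;
  #  the paths here are examples matching this article, not your system)
  sudo tmutil associatedisk -a /Volumes/Photos \
    "/Volumes/TimeMachine/Backups.backupdb/MyMac/Latest/Photos"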

If you want to see what’s happening while time machine is doing its run, you can try this from the command line:

log stream --style syslog --predicate 'senderImagePath contains[cd] "TimeMachine"' --info

Anyway, back to the original article:

…or, how to make Time Machine treat a duplicated and enlarged source volume as the original, and continue incremental backups.

It’s not supposed to be possible, but after 3 days of research, and multiple 4-8 hour backup sessions, I’ve cracked it – done something that, as far as I can tell, no one else believes can be done, or has documented how to do. If you follow the instructions here, using only free tools, you’ll be able to do it as well.

The Problem:

You have a Time Machine drive handling backup for multiple drives attached to your system. For example:

Boot / User: 500GB, Photos: 900GB.

Right now, your Time Machine backup is around 1.4TB minimum. On a 2TB drive, that leaves you with 600GB for historical versions of files. Every time you change a file in any way, another copy of the file is added to the Time Machine volume.

Let’s assume your Photos drive is 1TB, and you need to move the contents onto a larger drive before it runs out of space.

You plug in a new 2TB drive, format it with the same name, copy the contents of the Photos drive across, remove the old Photos drive.

You let Time Machine do its thing.

Time Machine will treat the new 2TB Photos drive as a different drive from the original, and perform a full backup of the drive, even though the data on it is identical in every way. Using Carbon Copy Cloner, or Disk Utility’s Restore function will not get around this.

Your Time Machine storage now requires: Boot / User: 500GB, Photos (old): 900GB, Photos (new): 900GB, for a total of 2300GB. You’ve now got a backup that’s larger than your 2TB Time Machine volume, and importantly, Time Machine will delete all your historical backups in order to make room for what will effectively be two identical copies of most of your photos.

The Cause:

This happens because Time Machine uses the UUID of the drive to identify it. The UUID is assigned to the drive when it is formatted in Disk Utility; it’s effectively random, and is unaffected by changes to the name of a drive. This means you can change the name of your drive without triggering a full backup. It also means the integrity of your Time Machine backups can’t be affected by temporarily plugging in another drive of the same name, even if the contents are mostly identical.
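
You can check a volume’s UUID from the command line as well. A quick sketch using the built-in diskutil, with this article’s example volume name:

  # Show the Volume UUID that Time Machine uses to identify the drive
  diskutil info /Volumes/Photos | grep -i uuid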

In general, it’s a safety feature, but as above, it has a serious drawback.

The Solution:

In order to make the enlarged drive behave as a continuation of the old one, you have to fool Time Machine into thinking the new drive is the old one. To do this, you have to copy the data correctly, alter the new drive’s UUID to match the original, then alter the original drive’s UUID so it doesn’t conflict.

Tools You’ll Need:

Note: this is how I made it work – some stuff here may not be totally necessary, but as anyone who used SCSI back in the day knows, superstition is an important part of technology.

  1. A separate bootable MacOS drive / partition where you’ll do all the tasks (to avoid the possibility that your normal system will record what you do in the filesystem events record), and in which you’ve switched off Time Machine. We’ll refer to it as “Tools“. Not the (probably internal) drive you use normally, which we’ll refer to as “MyMac“.
  2. The old (nearly full) drive – “Photos” for this example.
  3. The new larger drive.
  4. A USB thumb drive.
  5. A copy of Shirt-Pocket’s SuperDuper.
  6. A plain text file to act as a scratchpad for copying and pasting things (there’s a pre-formatted version of the whole process below, as a cheat-sheet to use once you’ve read the process and understand what to do).

Method:

  1. Boot your mac from the normal MyMac drive.
    1. Run a Time Machine backup.
    2. Switch off Time Machine in System Preferences.
  2. Reboot to Tools.
    1. Download and install SuperDuper (to Applications on the Tools drive).
    2. Open Disk Utility
      1. Plug in and format the new larger drive as a standard Mac volume (HFS+ Journalled, case insensitive – make it match the drive you want to clone). The name doesn’t matter.
      2. Plug in and format the USB thumb Drive as a standard Mac volume.
      3. Get info on the old Photos drive, and note down its UUID in your text file. (Depending on the version of Disk Utility you have, you might have to get this from the System Information in About This Mac.)
      4. Select your Photos drive on the left, and use the Restore function to copy it onto the new larger drive (probably called “Untitled”).
      5. Wait some hours while it does its thing. Restore uses block copy which ensures the files aren’t touched or changed by being copied.
      6. Note down the UUID of the new version of Photos on your text file. This is so you can ensure it’s changed later on.
    3. Go to Finder, navigate to where you have installed SuperDuper (Applications).
      1. Right click on SuperDuper, choose “Show Package Contents”, then navigate to Contents / MacOS /
      2. Make sure you can see the exec file SDDiskTool.
    4. Open Terminal.
      1. Change directory to the MacOS directory above – you’re going to be using SDDiskTool from the command line, but your command has to be run from within this directory. The easiest way to do this is (the angle brackets are for actions – don’t type them):
        cd <drag the "MacOS" title icon in the titlebar of the window to here> <hit enter>
      2. Get the UUID in encoded form from the original Photos drive:
        ./SDDiskTool -g <drag old Photos source disk from Finder> <enter>
      3. You’ll see a string of characters returned to the next line, before your command prompt. This is the encoded UUID. Important: Copy those characters to your text file for safe keeping. Be careful selecting them, as they’ll run into the name of your machine at the command prompt.
      4. Set the encoded UUID to the new Photos drive:
        sudo ./SDDiskTool -s <paste the encoded uuid> <drag new Photos target disk from Finder> <enter>
      5. Enter the admin password for your Tools drive system Admin account.
    5. Quit Disk Utility (it has to be relaunched to see the new UUID)
    6. Eject BOTH Photos drives in Finder.
    7. Relaunch Disk Utility and mount the new Photos drive.
      1. Copy the UUID to your text file, and check it matches the original Photos drive version.
      2. Unmount the new Photos drive.
      3. Remount the old Photos drive.
    8. Repeat this process to assign the UUID from the USB thumb drive to your old Photos drive, so that when you reboot to MyMac it won’t confuse Time Machine (or, to be super safe, physically unplug the old Photos drive, do step 9, then reboot to Tools and do this step).
    9. Reboot to MyMac (eject the old Photos drive if you haven’t changed its UUID), then manually trigger a Time Machine backup. You should be able to tell pretty quickly if it’s worked, by the size of the backup that’ll be performed – if it’s the same size as, or larger than, your Photos drive, it hasn’t worked. In that case, stop it, delete the .inprogress file from the Backups.backupdb folder on your Time Machine drive, and start the process from scratch. That’s why you kept the encoded UUID numbers: if you’ve overwritten the old Photos drive UUID with the one from the thumbdrive, the encoded value is the only way to reassign it. If you open Console, go to System Log > All Messages on the left, then fill in the filter search field on the top right with com.apple.backupd prior to triggering the backup, it’ll tell you how much will be backed up from your various drives. If you’ve done a backup immediately prior to this whole process, then the first backup you do afterwards should be negligible.
    10. If it all worked OK, turn Time Machine automatic backups back on.
    11. If it all goes wrong, and you can’t get the UUIDs to copy, or the copying to stick – my suggestion is to go into the Time Machine history browsing interface while viewing your whole computer in a Finder window (where you can see all your drives), right click on the Photos drive in your most recent level of backup, and delete all backups of Photos (they’re safe on your old Photos drive). That will clear off all your Photos backups, meaning there’s space to put an entire fresh backup of Photos onto the Time Machine drive without deleting the historical backups of your Boot / User drive. This is basically where you were going to end up in the first place.

Standard Disclaimer: I take no responsibility for you hosing all of your data and backups doing this. It’s working for me, and I pieced it together and adapted it from bits around the web – mostly relating to how SuperDuper handles working with multiple drives. I suggest doing a practice run with a couple of thumbdrives to make sure you can do it properly.

Cheat Sheet:

1 install Superduper
2 launch terminal
3 cd to inside superduper bundle contents / MacOS
4 ./SDDiskTool -g <drag source disk> <enter>
5 copy encoded uuid
6 sudo ./SDDiskTool -s <paste the uuid> <drag target disk> <enter>
7 enter password
8 unmount disk, quit disk utility, relaunch disk utility and remount disk
9 Write & Check UUIDs
 
 Drive:
 Current UUID: XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX <- The drive you're cloning *to*
 Desired UUID: XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX <- The drive you're cloning *from*
 Encoded UUID: XXXXXXXXXXXXXXXX
 Altered UUID: XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX <- Drive you're cloning *to*
                                                       after the process, should match Desired UUID.

Dedicated in part to the memory of the late James “Pondini” Pond.

Thanks to Dave Nanian and Shirt Pocket Software for producing SDDiskTool.

If this article was of use, a donation would help support my projects.


Failure is not a consideration.

Something that I’ve been increasingly feeling, as I get to 23 years within the Apple software and hardware ecosystem, is how often the way Apple’s products function can best be described as “careless“.

Careless, as in “very little care was put into thinking about how this product would function“, or rather, “no care was put into thinking about how this would fail to function“. You see, if you’ve been using Apple products lately, what you’ll notice is that overwhelmingly, they’ve become black boxes – iCloud for example, when it breaks, how do you fix it? Where do you find the canonical copy of your data, to ensure it’s correct? How do you force a device to sync its data?

You don’t. The entire thing is designed on the premise that it works perfectly, and thus, it contains no manual overrides, no diagnostics, no repairs, it’s just a black box. What’s the way to “fix” iCloud problems? Switch it off, and switch it on again.

Seriously.

That’s the way to fix it. Log out of iCloud, and log back in again. Then, spend days finding things that don’t work any more, because when you log back in, all the things that were logged out aren’t automatically reconnected. FaceTime, Messages, Apple Pay – all these things need their own authentication, and nothing is done to automate that process.

Which brings me to my current bugbear – Time Machine.

Time Machine is typical Apple – it claims to work, it claims to be simple, and when it fails, well, why would it ever fail? Why would anyone need to do anything beyond press one button to start it?

My workstation has a boot/user drive, a 1TB Photos drive, and a 2TB Time Machine drive, which is redundantly paired to a second 2TB external Time Machine USB drive. The Photos drive is almost full, and its size is the effective limiter on how much space is left over in the backup for older versions of my user drive files – so to upgrade it, I also have to upgrade my Time Machine drive(s).

A new 4TB drive is installed, formatted, and ready to roll, so I follow Apple’s official guide for how to migrate to a larger Time Machine volume: I set the permissions of the new drive, then copy the backup folder from the old drive to the new one.

Finder begins “Preparing to copy”. Seven hours later, it is still preparing to copy. There is no estimated time remaining, no progress bar, just a number of files which keeps getting higher. I leave my 12 core Xeon workstation – 200+ watts at idle – running all night, and the next morning check in, expecting to see the copy has finished.

Nope.

What my high-power space heater has been doing, for who knows how long, is sitting, doing nothing, while Finder waits for me to authenticate, so the system can elevate its privileges to actually copy the files.

Why the hell did that not happen at the start of the process?

Well, you see there are security implications…

No. No, you don’t get to have that excuse. This is a process with an entire working day spent just to get ready to do the task, and then another estimated 20+ hours to do the actual copy. As software developers, you have no right to demand user interaction inside that block of time. Time Machine should never have made it out of the lab without a dedicated app that handles migrating backups between disks – an app that works reliably, an app that stores all the authentication details that will be needed, and which can work around errors, and continue the backup while logging things that go wrong for the user’s attention. Why is all that required?

Three hours after the copy started, it failed: “Finder could not complete the operation because an unknown error occurred (error -50)”.

So, 11 hours wasted.

Now, a second attempt is being made, this time using Disk Utility to restore the old backup drive to the new one, using a block copy. It’s about 1/8th of the way through, and saying 7 hours remaining.


Autopano Giga Wifi Bug

I’ve encountered an interesting bug in Autopano Giga (APG), a product from Kolor, a subsidiary of GoPro.

The Symptom:

Whenever APG is running, it constantly triggers the Mac wifi control software airportd to actively scan for available WIFI networks.

This happens when the system is connected via Ethernet, and not even using WIFI for its network traffic.

This happens despite the options to send analytics, and to check for beta versions, being disabled.

Below is 10 seconds of captured logfile from OS X’s wifi.log, covering launch, and then immediate quit.

287 lines of logfile in 10 seconds, and that rate continues for as long as the software is running. Once again, this machine is not using WIFI for any actual network traffic, and APG isn’t transmitting any data over it – it’s merely waking up the WIFI system, telling it to scan the local basestations, and then writing a massive amount of data to multiple logfiles. God only knows what this would do to the battery life on a portable.

Turning off WIFI is the only way to make it stop.
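
If you want to watch this happening live on your own machine, the log in question is a plain file on the OS versions concerned. A sketch, assuming the pre-Sierra wifi.log location – verify the path on your system:

  # Follow the wifi log while launching and quitting APG
  tail -f /var/log/wifi.log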

The Cause:

Kolor’s official twitter account suggested to me that this is caused by APG looking for updates.

  • Where’s the setting to turn off checking for updates? I’ve turned off checking for beta versions, that makes no difference.
  • APG attempts to connect to google on quit, which you would only know if you had Little Snitch installed and telling you what apps are trying to phone home behind your back.
  • If it is an update check, why is APG looking for updates, every (approximately) 5 seconds as long as it is running?
  • Why is it powering up the WIFI scan, when the system is set to use Ethernet as its network?

The Horror:

Mon Apr 3 16:49:01.378 IPC: <airportd[54]> ADDED XPC CLIENT CONNECTION [AutopanoGiga (pid=1017, euid=502, egid=20)]
Mon Apr 3 16:49:01.379 Info: <airportd[54]> SCAN request received from pid 1017 (AutopanoGiga) with priority 0
Mon Apr 3 16:49:01.379 <airportd[54]> WARNING: AutopanoGiga (1017) is not entitled for com.apple.wifi.scan, temporarily allowing request with background priority —— all entitlement requirements will be strictly enforced in a future release
Mon Apr 3 16:49:01.380 <kernel> IO80211ScanManager::startScan: Broadcast scan request received from 'airportd' (pid 54) ().
Mon Apr 3 16:49:01.380 <kernel> IO80211ScanManager::startScan: Initiating scan.
Mon Apr 3 16:49:01.570 <kernel> IO80211ScanManager::scanDone: Scheduling cache purge timer in 30 seconds.
Mon Apr 3 16:49:01.570 <kernel> IO80211ScanManager::getScanResult: All scan results returned for 'airportd' (pid 54).
Mon Apr 3 16:49:01.570 Driver Event: <airportd[54]> _bsd_80211_event_callback: SCAN_CACHE_UPDATED (en2)
Mon Apr 3 16:49:01.570 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:01.570 <CWChannel: 0x7f9ba8508670> [channelNumber=1(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:01.570 <CWChannel: 0x7f9ba8506410> [channelNumber=2(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:01.570 <CWChannel: 0x7f9ba8508a70> [channelNumber=3(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:01.570 <CWChannel: 0x7f9ba8509ea0> [channelNumber=4(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:01.570 <CWChannel: 0x7f9ba8510260> [channelNumber=5(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:01.570 <CWChannel: 0x7f9ba8512a50> [channelNumber=6(2GHz), channelWidth={20MHz}, active]
Mon Apr 3 16:49:01.570 )} took 0.1901 seconds, returned 2 results
Mon Apr 3 16:49:01.570 Info: <Wi-Fi Menu Extra[289]> scan cache updated
Mon Apr 3 16:49:01.570 <kernel> IO80211ScanManager::startScan: Broadcast scan request received from 'airportd' (pid 54) ().
Mon Apr 3 16:49:01.570 <kernel> IO80211ScanManager::startScan: Initiating scan.
Mon Apr 3 16:49:01.571 Info: <airportd[54]> QUERY SCAN CACHE request received from pid 220 (locationd)
Mon Apr 3 16:49:01.760 <kernel> IO80211ScanManager::getScanResult: All scan results returned for 'airportd' (pid 54).
Mon Apr 3 16:49:01.761 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:01.761 <CWChannel: 0x7f9ba850f5e0> [channelNumber=7(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:01.761 <CWChannel: 0x7f9ba8508790> [channelNumber=8(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:01.761 <CWChannel: 0x7f9ba8525b70> [channelNumber=9(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:01.761 <CWChannel: 0x7f9ba850bc80> [channelNumber=10(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:01.761 <CWChannel: 0x7f9ba85270f0> [channelNumber=11(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:01.761 <CWChannel: 0x7f9ba8527030> [channelNumber=36(5GHz), channelWidth={40MHz(+1)}, active]
Mon Apr 3 16:49:01.761 )} took 0.1907 seconds, returned 0 results
Mon Apr 3 16:49:01.761 <kernel> IO80211ScanManager::startScan: Broadcast scan request received from 'airportd' (pid 54) ().
Mon Apr 3 16:49:01.761 <kernel> IO80211ScanManager::startScan: Initiating scan.
Mon Apr 3 16:49:01.973 <kernel> IO80211ScanManager::getScanResult: All scan results returned for 'airportd' (pid 54).
Mon Apr 3 16:49:01.973 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:01.973 <CWChannel: 0x7f9ba85084b0> [channelNumber=40(5GHz), channelWidth={40MHz(-1)}, active],
Mon Apr 3 16:49:01.973 <CWChannel: 0x7f9ba850f7b0> [channelNumber=44(5GHz), channelWidth={40MHz(+1)}, active],
Mon Apr 3 16:49:01.973 <CWChannel: 0x7f9ba853ff70> [channelNumber=48(5GHz), channelWidth={40MHz(-1)}, active],
Mon Apr 3 16:49:01.973 <CWChannel: 0x7f9ba851b2a0> [channelNumber=149(5GHz), channelWidth={80MHz}, active],
Mon Apr 3 16:49:01.973 <CWChannel: 0x7f9ba851b6c0> [channelNumber=153(5GHz), channelWidth={80MHz}, active],
Mon Apr 3 16:49:01.973 <CWChannel: 0x7f9ba851bae0> [channelNumber=157(5GHz), channelWidth={80MHz}, active]
Mon Apr 3 16:49:01.973 )} took 0.2124 seconds, returned 1 results
Mon Apr 3 16:49:01.973 Driver Event: <airportd[54]> _bsd_80211_event_callback: SCAN_CACHE_UPDATED (en2)
Mon Apr 3 16:49:01.974 <kernel> IO80211ScanManager::startScan: Broadcast scan request received from 'airportd' (pid 54) ().
Mon Apr 3 16:49:01.974 <kernel> IO80211ScanManager::startScan: Initiating scan.
Mon Apr 3 16:49:01.975 Info: <airportd[54]> QUERY SCAN CACHE request received from pid 220 (locationd)
Mon Apr 3 16:49:02.669 <kernel> IO80211ScanManager::getScanResult: All scan results returned for 'airportd' (pid 54).
Mon Apr 3 16:49:02.669 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:02.669 <CWChannel: 0x7f9ba851bf00> [channelNumber=161(5GHz), channelWidth={80MHz}, active],
Mon Apr 3 16:49:02.669 <CWChannel: 0x7f9ba851c320> [channelNumber=165(5GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:02.669 <CWChannel: 0x7f9ba8508c20> [channelNumber=12(2GHz), channelWidth={20MHz}],
Mon Apr 3 16:49:02.670 <CWChannel: 0x7f9ba8528c90> [channelNumber=13(2GHz), channelWidth={20MHz}],
Mon Apr 3 16:49:02.670 <CWChannel: 0x7f9ba850fa90> [channelNumber=52(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:02.670 <CWChannel: 0x7f9ba850e990> [channelNumber=56(5GHz), channelWidth={40MHz(-1)}, DFS]
Mon Apr 3 16:49:02.670 )} took 0.6960 seconds, returned 0 results
Mon Apr 3 16:49:02.670 Info: <Wi-Fi Menu Extra[289]> scan cache updated
Mon Apr 3 16:49:02.670 <kernel> IO80211ScanManager::startScan: Broadcast scan request received from 'airportd' (pid 54) ().
Mon Apr 3 16:49:02.670 <kernel> IO80211ScanManager::startScan: Initiating scan.
Mon Apr 3 16:49:03.589 <kernel> IO80211ScanManager::getScanResult: All scan results returned for 'airportd' (pid 54).
Mon Apr 3 16:49:03.589 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:03.589 <CWChannel: 0x7f9ba8529710> [channelNumber=60(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:03.589 <CWChannel: 0x7f9ba8529b30> [channelNumber=64(5GHz), channelWidth={40MHz(-1)}, DFS],
Mon Apr 3 16:49:03.589 <CWChannel: 0x7f9ba8529f50> [channelNumber=100(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:03.589 <CWChannel: 0x7f9ba8511e30> [channelNumber=104(5GHz), channelWidth={40MHz(-1)}, DFS],
Mon Apr 3 16:49:03.589 <CWChannel: 0x7f9ba850fd50> [channelNumber=108(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:03.589 <CWChannel: 0x7f9ba850ee50> [channelNumber=112(5GHz), channelWidth={40MHz(-1)}, DFS]
Mon Apr 3 16:49:03.589 )} took 0.9198 seconds, returned 0 results
Mon Apr 3 16:49:03.590 <kernel> IO80211ScanManager::startScan: Broadcast scan request received from 'airportd' (pid 54) ().
Mon Apr 3 16:49:03.590 <kernel> IO80211ScanManager::startScan: Initiating scan.
Mon Apr 3 16:49:04.512 <kernel> IO80211ScanManager::getScanResult: All scan results returned for 'airportd' (pid 54).
Mon Apr 3 16:49:04.512 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:04.512 <CWChannel: 0x7f9ba850f270> [channelNumber=116(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:04.512 <CWChannel: 0x7f9ba850dfe0> [channelNumber=120(5GHz), channelWidth={40MHz(-1)}, DFS],
Mon Apr 3 16:49:04.512 <CWChannel: 0x7f9ba850e400> [channelNumber=124(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:04.512 <CWChannel: 0x7f9ba8519cb0> [channelNumber=128(5GHz), channelWidth={40MHz(-1)}, DFS],
Mon Apr 3 16:49:04.513 <CWChannel: 0x7f9ba851a0d0> [channelNumber=132(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:04.513 <CWChannel: 0x7f9ba851a4f0> [channelNumber=136(5GHz), channelWidth={40MHz(-1)}, DFS]
Mon Apr 3 16:49:04.513 )} took 0.9229 seconds, returned 0 results
Mon Apr 3 16:49:04.513 Info: <Wi-Fi Menu Extra[289]> scan cache updated
Mon Apr 3 16:49:04.513 <kernel> IO80211ScanManager::startScan: Broadcast scan request received from 'airportd' (pid 54) ().
Mon Apr 3 16:49:04.513 <kernel> IO80211ScanManager::startScan: Initiating scan.
Mon Apr 3 16:49:04.790 <kernel> IO80211ScanManager::getScanResult: All scan results returned for 'airportd' (pid 54).
Mon Apr 3 16:49:04.790 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:04.790 <CWChannel: 0x7f9ba851a910> [channelNumber=140(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:04.790 <CWChannel: 0x7f9ba85123b0> [channelNumber=144(5GHz), channelWidth={40MHz(-1)}, DFS]
Mon Apr 3 16:49:04.790 )} took 0.2771 seconds, returned 0 results
Mon Apr 3 16:49:04.799 IPC: <airportd[54]> INVALIDATED XPC CLIENT CONNECTION [AutopanoGiga (pid=1017, euid=502, egid=20)]
Mon Apr 3 16:49:06.570 <kernel> IO80211ScanManager::startScan: Broadcast scan request received from 'airportd' (pid 54) ().
Mon Apr 3 16:49:06.570 <kernel> IO80211ScanManager::getScanResult: All scan results returned for 'airportd' (pid 54).
Mon Apr 3 16:49:11.244 IPC: <airportd[54]> ADDED XPC CLIENT CONNECTION [AutopanoGiga (pid=1017, euid=502, egid=20)]
Mon Apr 3 16:49:11.245 Info: <airportd[54]> SCAN request received from pid 1017 (AutopanoGiga) with priority 0
Mon Apr 3 16:49:11.246 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 1 does not require a live scan
Mon Apr 3 16:49:11.246 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 2 does not require a live scan
Mon Apr 3 16:49:11.246 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 3 does not require a live scan
Mon Apr 3 16:49:11.246 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 4 does not require a live scan
Mon Apr 3 16:49:11.246 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 5 does not require a live scan
Mon Apr 3 16:49:11.246 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 6 does not require a live scan
Mon Apr 3 16:49:11.246 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga does not require a live scan
Mon Apr 3 16:49:11.246 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:11.246 <CWChannel: 0x7f9ba890fa00> [channelNumber=1(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.246 <CWChannel: 0x7f9ba8912810> [channelNumber=2(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.246 <CWChannel: 0x7f9ba890fa60> [channelNumber=3(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.246 <CWChannel: 0x7f9ba8912800> [channelNumber=4(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.246 <CWChannel: 0x7f9ba890fea0> [channelNumber=5(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.246 <CWChannel: 0x7f9ba8916fe0> [channelNumber=6(2GHz), channelWidth={20MHz}, active]
Mon Apr 3 16:49:11.246 )} took 0.0004 seconds, returned 2 results
Mon Apr 3 16:49:11.246 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 7 does not require a live scan
Mon Apr 3 16:49:11.246 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 8 does not require a live scan
Mon Apr 3 16:49:11.246 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 9 does not require a live scan
Mon Apr 3 16:49:11.246 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 10 does not require a live scan
Mon Apr 3 16:49:11.246 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 11 does not require a live scan
Mon Apr 3 16:49:11.246 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 36 does not require a live scan
Mon Apr 3 16:49:11.246 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga does not require a live scan
Mon Apr 3 16:49:11.246 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:11.246 <CWChannel: 0x7f9ba89034d0> [channelNumber=7(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.246 <CWChannel: 0x7f9ba89035a0> [channelNumber=8(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.246 <CWChannel: 0x7f9ba8916590> [channelNumber=9(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.246 <CWChannel: 0x7f9ba8918220> [channelNumber=10(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.246 <CWChannel: 0x7f9ba890eab0> [channelNumber=11(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba891ba00> [channelNumber=36(5GHz), channelWidth={40MHz(+1)}, active]
Mon Apr 3 16:49:11.247 )} took 0.0002 seconds, returned 0 results
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 40 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 44 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 48 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 149 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 153 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 157 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga does not require a live scan
Mon Apr 3 16:49:11.247 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba891be20> [channelNumber=40(5GHz), channelWidth={40MHz(-1)}, active],
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba8910220> [channelNumber=44(5GHz), channelWidth={40MHz(+1)}, active],
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba8910640> [channelNumber=48(5GHz), channelWidth={40MHz(-1)}, active],
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba8927d90> [channelNumber=149(5GHz), channelWidth={80MHz}, active],
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba89281b0> [channelNumber=153(5GHz), channelWidth={80MHz}, active],
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba89285d0> [channelNumber=157(5GHz), channelWidth={80MHz}, active]
Mon Apr 3 16:49:11.247 )} took 0.0003 seconds, returned 1 results
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 161 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 165 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 12 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 13 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 52 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 56 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga does not require a live scan
Mon Apr 3 16:49:11.247 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba89289f0> [channelNumber=161(5GHz), channelWidth={80MHz}, active],
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba8928e10> [channelNumber=165(5GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba890eba0> [channelNumber=12(2GHz), channelWidth={20MHz}],
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba891b8a0> [channelNumber=13(2GHz), channelWidth={20MHz}],
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba8910a60> [channelNumber=52(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba8910f50> [channelNumber=56(5GHz), channelWidth={40MHz(-1)}, DFS]
Mon Apr 3 16:49:11.247 )} took 0.0002 seconds, returned 0 results
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 60 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 64 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 100 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 104 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 108 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 112 does not require a live scan
Mon Apr 3 16:49:11.247 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga does not require a live scan
Mon Apr 3 16:49:11.247 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba89241b0> [channelNumber=60(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba89245d0> [channelNumber=64(5GHz), channelWidth={40MHz(-1)}, DFS],
Mon Apr 3 16:49:11.247 <CWChannel: 0x7f9ba89249f0> [channelNumber=100(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.248 <CWChannel: 0x7f9ba8924e10> [channelNumber=104(5GHz), channelWidth={40MHz(-1)}, DFS],
Mon Apr 3 16:49:11.248 <CWChannel: 0x7f9ba8910d20> [channelNumber=108(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.248 <CWChannel: 0x7f9ba8925720> [channelNumber=112(5GHz), channelWidth={40MHz(-1)}, DFS]
Mon Apr 3 16:49:11.248 )} took 0.0002 seconds, returned 0 results
Mon Apr 3 16:49:11.248 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 116 does not require a live scan
Mon Apr 3 16:49:11.248 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 120 does not require a live scan
Mon Apr 3 16:49:11.248 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 124 does not require a live scan
Mon Apr 3 16:49:11.248 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 128 does not require a live scan
Mon Apr 3 16:49:11.248 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 132 does not require a live scan
Mon Apr 3 16:49:11.248 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 136 does not require a live scan
Mon Apr 3 16:49:11.248 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga does not require a live scan
Mon Apr 3 16:49:11.248 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:11.248 <CWChannel: 0x7f9ba8925b40> [channelNumber=116(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.248 <CWChannel: 0x7f9ba8925f60> [channelNumber=120(5GHz), channelWidth={40MHz(-1)}, DFS],
Mon Apr 3 16:49:11.248 <CWChannel: 0x7f9ba8926380> [channelNumber=124(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.248 <CWChannel: 0x7f9ba89267a0> [channelNumber=128(5GHz), channelWidth={40MHz(-1)}, DFS],
Mon Apr 3 16:49:11.248 <CWChannel: 0x7f9ba8926bc0> [channelNumber=132(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.248 <CWChannel: 0x7f9ba8926fe0> [channelNumber=136(5GHz), channelWidth={40MHz(-1)}, DFS]
Mon Apr 3 16:49:11.248 )} took 0.0002 seconds, returned 0 results
Mon Apr 3 16:49:11.248 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 140 does not require a live scan
Mon Apr 3 16:49:11.248 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 144 does not require a live scan
Mon Apr 3 16:49:11.248 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga does not require a live scan
Mon Apr 3 16:49:11.248 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:11.248 <CWChannel: 0x7f9ba8927400> [channelNumber=140(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.248 <CWChannel: 0x7f9ba8925390> [channelNumber=144(5GHz), channelWidth={40MHz(-1)}, DFS]
Mon Apr 3 16:49:11.248 )} took 0.0001 seconds, returned 0 results
Mon Apr 3 16:49:11.256 IPC: <airportd[54]> INVALIDATED XPC CLIENT CONNECTION [AutopanoGiga (pid=1017, euid=502, egid=20)]
Mon Apr 3 16:49:11.276 IPC: <airportd[54]> ADDED XPC CLIENT CONNECTION [AutopanoGiga (pid=1017, euid=502, egid=20)]
Mon Apr 3 16:49:11.276 Info: <airportd[54]> SCAN request received from pid 1017 (AutopanoGiga) with priority 0
Mon Apr 3 16:49:11.277 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 1 does not require a live scan
Mon Apr 3 16:49:11.277 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 2 does not require a live scan
Mon Apr 3 16:49:11.277 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 3 does not require a live scan
Mon Apr 3 16:49:11.277 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 4 does not require a live scan
Mon Apr 3 16:49:11.277 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 5 does not require a live scan
Mon Apr 3 16:49:11.277 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 6 does not require a live scan
Mon Apr 3 16:49:11.277 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga does not require a live scan
Mon Apr 3 16:49:11.277 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:11.277 <CWChannel: 0x7f9babe22710> [channelNumber=1(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.277 <CWChannel: 0x7f9babe07cb0> [channelNumber=2(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.277 <CWChannel: 0x7f9babe1bf20> [channelNumber=3(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.277 <CWChannel: 0x7f9babe07ca0> [channelNumber=4(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.277 <CWChannel: 0x7f9babe00690> [channelNumber=5(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.277 <CWChannel: 0x7f9babe0daf0> [channelNumber=6(2GHz), channelWidth={20MHz}, active]
Mon Apr 3 16:49:11.277 )} took 0.0003 seconds, returned 2 results
Mon Apr 3 16:49:11.277 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 7 does not require a live scan
Mon Apr 3 16:49:11.277 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 8 does not require a live scan
Mon Apr 3 16:49:11.277 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 9 does not require a live scan
Mon Apr 3 16:49:11.277 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 10 does not require a live scan
Mon Apr 3 16:49:11.277 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 11 does not require a live scan
Mon Apr 3 16:49:11.277 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 36 does not require a live scan
Mon Apr 3 16:49:11.277 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga does not require a live scan
Mon Apr 3 16:49:11.277 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:11.277 <CWChannel: 0x7f9babe0e1a0> [channelNumber=7(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.277 <CWChannel: 0x7f9babe22050> [channelNumber=8(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.277 <CWChannel: 0x7f9babe1dc50> [channelNumber=9(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.277 <CWChannel: 0x7f9babe19080> [channelNumber=10(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.277 <CWChannel: 0x7f9babe16d80> [channelNumber=11(2GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.277 <CWChannel: 0x7f9babe30630> [channelNumber=36(5GHz), channelWidth={40MHz(+1)}, active]
Mon Apr 3 16:49:11.277 )} took 0.0002 seconds, returned 0 results
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 40 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 44 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 48 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 149 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 153 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 157 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga does not require a live scan
Mon Apr 3 16:49:11.278 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe0c660> [channelNumber=40(5GHz), channelWidth={40MHz(-1)}, active],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe2c1f0> [channelNumber=44(5GHz), channelWidth={40MHz(+1)}, active],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe2c370> [channelNumber=48(5GHz), channelWidth={40MHz(-1)}, active],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe1c360> [channelNumber=149(5GHz), channelWidth={80MHz}, active],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe25140> [channelNumber=153(5GHz), channelWidth={80MHz}, active],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe0acc0> [channelNumber=157(5GHz), channelWidth={80MHz}, active]
Mon Apr 3 16:49:11.278 )} took 0.0002 seconds, returned 1 results
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 161 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 165 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 12 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 13 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 52 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 56 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga does not require a live scan
Mon Apr 3 16:49:11.278 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe07140> [channelNumber=161(5GHz), channelWidth={80MHz}, active],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe30cb0> [channelNumber=165(5GHz), channelWidth={20MHz}, active],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe29350> [channelNumber=12(2GHz), channelWidth={20MHz}],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe06d10> [channelNumber=13(2GHz), channelWidth={20MHz}],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe31430> [channelNumber=52(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe08830> [channelNumber=56(5GHz), channelWidth={40MHz(-1)}, DFS]
Mon Apr 3 16:49:11.278 )} took 0.0002 seconds, returned 0 results
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 60 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 64 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 100 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 104 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 108 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 112 does not require a live scan
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga does not require a live scan
Mon Apr 3 16:49:11.278 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe2afd0> [channelNumber=60(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe1dfb0> [channelNumber=64(5GHz), channelWidth={40MHz(-1)}, DFS],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe29500> [channelNumber=100(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe0b4d0> [channelNumber=104(5GHz), channelWidth={40MHz(-1)}, DFS],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe2eb80> [channelNumber=108(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.278 <CWChannel: 0x7f9babe296a0> [channelNumber=112(5GHz), channelWidth={40MHz(-1)}, DFS]
Mon Apr 3 16:49:11.278 )} took 0.0002 seconds, returned 0 results
Mon Apr 3 16:49:11.278 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 116 does not require a live scan
Mon Apr 3 16:49:11.279 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 120 does not require a live scan
Mon Apr 3 16:49:11.279 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 124 does not require a live scan
Mon Apr 3 16:49:11.279 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 128 does not require a live scan
Mon Apr 3 16:49:11.279 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 132 does not require a live scan
Mon Apr 3 16:49:11.279 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 136 does not require a live scan
Mon Apr 3 16:49:11.279 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga does not require a live scan
Mon Apr 3 16:49:11.279 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:11.279 <CWChannel: 0x7f9babe0b750> [channelNumber=116(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.279 <CWChannel: 0x7f9babe15ef0> [channelNumber=120(5GHz), channelWidth={40MHz(-1)}, DFS],
Mon Apr 3 16:49:11.279 <CWChannel: 0x7f9babe0d2d0> [channelNumber=124(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.279 <CWChannel: 0x7f9babe07770> [channelNumber=128(5GHz), channelWidth={40MHz(-1)}, DFS],
Mon Apr 3 16:49:11.279 <CWChannel: 0x7f9babe1a9d0> [channelNumber=132(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.279 <CWChannel: 0x7f9babe2f5f0> [channelNumber=136(5GHz), channelWidth={40MHz(-1)}, DFS]
Mon Apr 3 16:49:11.279 )} took 0.0002 seconds, returned 0 results
Mon Apr 3 16:49:11.279 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 140 does not require a live scan
Mon Apr 3 16:49:11.279 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga on channel 144 does not require a live scan
Mon Apr 3 16:49:11.279 Scan: <airportd[54]> Cache-assisted scan request for AutopanoGiga does not require a live scan
Mon Apr 3 16:49:11.279 AutoJoin: <airportd[54]> Successful cache-assisted scan request for AutopanoGiga with channels {(
Mon Apr 3 16:49:11.279 <CWChannel: 0x7f9babe20790> [channelNumber=140(5GHz), channelWidth={40MHz(+1)}, DFS],
Mon Apr 3 16:49:11.279 <CWChannel: 0x7f9babe2ffb0> [channelNumber=144(5GHz), channelWidth={40MHz(-1)}, DFS]
Mon Apr 3 16:49:11.279 )} took 0.0001 seconds, returned 0 results
Mon Apr 3 16:49:11.287 IPC: <airportd[54]> INVALIDATED XPC CLIENT CONNECTION [AutopanoGiga (pid=1017, euid=502, egid=20)]

Anachronism

The anachronism of using wood for the case of a technological item is perhaps blunted by the use of Kevlar in the laminate. It provides scratch protection when placing the phone face-up or face-down on a flat surface, especially for the camera lens. A big plus is the minimalist logo on the back – too many cases have the casemaker’s name in huge letters.

While the packaging states it’s for the iPhone 5/5s, it’s also fine with the SE.


Apple Keynote Frustrations

Apple’s Keynote is an app that I’ve enjoyed using for years. It delivers a lot of power and polish for little effort, and reminds me of the thrill I had when first using Macromedia’s Director 8, on discovering how much of the app’s capability was available without touching any form of scripting language.

Yesterday, however, I discovered a couple of really serious gotchas that reveal some major limitations in the current software.

Presenter View

Presenter View is an option which, on a multi-display system, allows one screen to show the current slide, and the other to show the speaker’s notes, a timer, and the current slide.

A problem surfaces when you want to use your iPad to show your presenter notes while the presentation itself is being run off a different device. For example, consider a Pecha-Kucha presentation: 20 slides, 20 seconds per slide, auto-advancing, with the slides run from a central slide deck on a laptop.

Keynote for iPad won’t show Presenter View unless an external display is connected.

Even if you bring along an Apple TV and set up screen mirroring to it, unless that Apple TV is plugged into a display or projector, you’re out of luck. I’ve heard tell that you can set up Keynote on the iPad as a remote for Keynote on an iPhone, running the presentation on your phone, but that’s still a workaround.

Keynote for iPad needs an update that allows Presenter View to run without an external display connected – if for no other reason than to let you practice your talk.

Mac and iPad

With the launch of the iCloud-ified versions of Keynote for Mac and iPad, a lot of features were shed in pursuit of documents that are equally at home on either Mac or iPad. Great – I can get on board with this. The only problem is that the process isn’t complete, and yesterday, while doing the tech test for a presentation I was about to give, I was bitten HARD by this.

So here is the workspace UI for Keynote for Mac. What’s important to note is the palette on the right, with 3 tabs:

  • Build In
  • Action
  • Build Out

The way it works is that when a slide loads, the Build In settings run to create the slide, then the Action settings, then the Build Out settings, and finally the Transition effect when leaving the slide itself.

In order to get around the problem above of not having Presenter View available, I created a new slideshow – essentially what you see in Presenter View – with the goal of running it in time with the Pecha-Kucha slide deck.

Since you only have 20 seconds per slide, I wanted a countdown timer on each slide, but rather than using numbers, which are visually distracting, I decided a simple graphical solution would be better: a blue bar across the top of the slide, and another next to the bullet points – the former slides off screen to the right, the latter moves downwards past the various bullet points. This was done with an Action called Move, which lets you set the endpoint and duration of the movement.

Then, with the slide set to auto-transition after zero delay, you have a 20-second presenter slide, with a time-remaining indicator, that goes immediately to the next slide. Fantastic! Or so I thought.

When I arrived at the event and did the tech rehearsal, what I discovered is that PowerPoint running on a Mac has a different idea of what 20 seconds is than Keynote on an iPad – and I had tested my stack against my iPhone’s stopwatch. With PowerPoint set to 20 seconds and no transition time, the iPad was about 5 seconds behind by the end of my last slide.
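
The size of the correction falls straight out of those numbers – the total drift spread evenly across the deck:

# Back-of-envelope: spread the observed drift evenly across the deck.
slides = 20
drift = 5.0                      # seconds the iPad lagged by the last slide
trim_per_slide = drift / slides
print(trim_per_slide)            # 0.25 - shave this off each slide's Move action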

So, no problem – I’ll just edit each of my slides to remove 0.25 seconds from the Move action; that’ll compensate, and I’ll be back up to speed.

Nope!

While Keynote for iPad knows that the Action stage of the build process exists, and will play it, it offers no way to create or edit an Action.

As you can see from the video linked on the left, when you go to add a Build In or Out, the Build Order shows that there are Actions on the object, but there’s no option to adjust them.

A final gripe about the Mac version of Keynote: selecting an image, right-clicking, and choosing “Replace Image” brings up an iOS-style image picker that only shows you the contents of your system’s photo library – in my case, Aperture. It locks you out of accessing your actual filesystem, where, for example, you might have kept all the images you’re planning to use within a main project folder in ~/Documents.

This is a symptom of the overall problem I’ve found in trying to use the iPad to get actual work done. If everything you do lives in a single app, like drawing in Procreate, it’s fantastic. But if your task is assembly – bringing together media from multiple sources, tweaking and adjusting – then the fundamental nature of iOS, its inability to be file-centric the way the Finder makes a Mac, turns tasks that should be mundane and easy into something like trying to run while up to your waist in water.

iCloud Drive is not a solution – anyone who thinks the way to move files between devices is to send them through a server on the other side of the world needs to be sentenced to a year on dialup. iOS needs full peer networking with Macs. It needs the ability to access, and be accessed by, file sharing with the same capabilities as the Mac. Finally, it needs to ditch this ridiculous notion that data and documents are contained within the apps themselves. I should be able to delete an app without losing anything I’ve done using that app.


When your Mac Refuses to Sleep

The Symptoms:

You put your Mac to sleep, the screen(s) go black, but the machine doesn’t power down its drive(s) and sleep. Pressing a mouse button or a keyboard key brings the screen(s) straight back up.

The problem persists through reboots, logouts, switching user accounts, deleting com.apple.PowerManagement.plist, and even a full SMC reset.

The Solution:

Go to the Printers & Scanners preference pane and make sure you haven’t got a printer with a paused print queue, or a printer that isn’t actually connected any more, to which you may have accidentally sent a job that the system is still trying to deliver.
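
If the printer fix doesn’t apply to you, you can also ask the system directly what’s holding it awake: pmset -g assertions lists every active power assertion and the process that owns it. A quick sketch to pull out the relevant lines:

import subprocess

# `pmset -g assertions` lists active power assertions; non-zero
# PreventSystemSleep / PreventUserIdleSystemSleep entries, and the
# per-process lines beneath them, name whatever is keeping the Mac awake.
out = subprocess.check_output(["pmset", "-g", "assertions"], text=True)
for line in out.splitlines():
    if "Prevent" in line or "pid" in line:
        print(line)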

If this article was of use, a donation would help support my projects.