Sperner’s Lemma

The New York Times has a fun article today on using Sperner’s Lemma to find fair rents for roommates. It can be used to calculate a fair division in other game-theory scenarios, too, and I’d imagine there are other interesting uses for it in business dealings.

At each corner of the triangle, one “thing” shoulders the entire shared cost, and at the other two corners it’s free. Divide the triangle into smaller triangles, and each point of that subdivision represents a different division of those prices. The more triangles you divide it into, the more “fair” the prices can become. At each point, a participant is asked which of the items they’d be willing to accept at that cost.

Sperner’s Lemma is interesting because it bypasses potentially complex calculations of objective value, since players may assign different values to the same things. Instead, this “feels them out” to see what’s fair. The further towards the center of the triangle you get, the harder the decisions become for the players.
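To make that concrete, here’s a minimal sketch, in Python, of the kind of grid search the lemma backs up. Everything specific in it is invented for illustration: three roommates, a $3,000 rent, a made-up table of how much each person privately values each room, and a simple “take a free room if one is offered, otherwise take the best value for the money” rule standing in for real people’s answers. The search scans the small triangles for one whose three corners were claimed for three different rooms; Sperner’s Lemma is what guarantees such a triangle exists (as long as nobody picks a full-price room over a free one), and its corner prices are an approximately fair split.

    # A rough sketch of Sperner-style rent splitting. The rent, the roommates'
    # valuations, and the "miserly" choice rule below are all made-up
    # assumptions for illustration, not anything from the article.

    TOTAL_RENT = 3000.0
    N = 60  # grid fineness: more triangles, "fairer" prices

    # Hypothetical private valuations: how much each roommate thinks each room
    # is worth. In real life you'd just ask people instead of using a table.
    VALUATIONS = [
        [1400.0, 1000.0,  600.0],   # roommate 0
        [1100.0, 1200.0,  700.0],   # roommate 1
        [ 900.0,  950.0, 1150.0],   # roommate 2
    ]

    def prices(i, j, k):
        """Grid point (i, j, k) with i + j + k == N, read as a rent split."""
        return (i / N * TOTAL_RENT, j / N * TOTAL_RENT, k / N * TOTAL_RENT)

    def owner(i, j):
        """Assign each grid vertex to a roommate so that every small triangle
        has all three roommates at its corners (the Sperner-style setup)."""
        return (i + 2 * j) % 3

    def choice(person, p):
        """Which room this roommate claims at prices p: take a free room if
        one exists, otherwise the best value-for-money room."""
        free = [r for r in range(3) if p[r] == 0.0]
        candidates = free if free else range(3)
        return max(candidates, key=lambda r: VALUATIONS[person][r] - p[r])

    def label(i, j, k):
        return choice(owner(i, j), prices(i, j, k))

    def fair_split():
        """Scan the small triangles for one whose corners carry all three room
        labels; its corner prices are an approximately envy-free split."""
        def check(corners):
            if {label(*c) for c in corners} == {0, 1, 2}:
                return [prices(*c) for c in corners]
            return None

        for i in range(N):
            for j in range(N - i):
                k = N - 1 - i - j
                # "upward" small triangle
                hit = check([(i + 1, j, k), (i, j + 1, k), (i, j, k + 1)])
                if hit:
                    return hit
                # "downward" small triangle, when it fits inside the big one
                if k >= 1:
                    hit = check([(i, j + 1, k), (i + 1, j, k), (i + 1, j + 1, k - 1)])
                    if hit:
                        return hit
        return None  # shouldn't happen if everyone prefers free rooms

    print(fair_split())

With N = 60 the grid spacing is $50, so the three corner price-splits of the returned triangle differ by at most $50 per room, and any of them is a reasonable answer. Cranking N up shrinks that gap, which is exactly the “more triangles, more fair” idea above.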

Why HBO’s “Silicon Valley” Would Never Happen In The Real World

The new HBO series ‘Silicon Valley’ has been really fun. Mike Judge has again created something hilarious and quotable, targeting a community that is ripe for parody.

There’s just one problem: The premise of the show is almost certainly impossible.  

In the premiere, our main character Richard works at a large tech company named “Hooli”, a play on Google.

Richard has built a music program on the side, which includes a revolutionary compression algorithm. When Hooli finds out about it, they try to buy it from Richard for ten million dollars. Other entrepreneurs circle, attempting to buy in.  

Hold on—would Hooli (Google) really try to buy a piece of software from one of its own employees? Back when I started at Intel, I had to sign an agreement that basically said that the company owned anything I created while employed there, regardless of where or when I might’ve conceived or constructed it. Companies like Intel, Google or Apple own their employees and all of their creative output.  

In the real world, Hooli (Google) would have just taken Richard’s compression algorithm, as Richard would’ve certainly signed something saying they owned it. If he tried to resist, they would probably sue him.

As I mentioned in a previous post, I flirted with a software engineer position at Apple, but was scared off by a similar draconian “employee ownership” policy. I asked the interviewer if I would be allowed to have any other business dealings outside of work hours, and his reply was “maybe if you were, like, selling oranges on the side of the road… but other than that, no.” You could have absolutely nothing to do with software after hours, unless you had zero aspiration of exposing it to the world.

But ‘Silicon Valley’ is still funny, and I understand that the “Hooli” setup provides great opportunities to poke fun at the “Google culture.” But I worry that young, entrepreneurial programmers might be misled into thinking that a job at one of these tech behemoths is a step in the right direction towards building something of their own. You might be able to work there a couple of years, learn valuable skills, leave, and then start something of your own. But are you sure you won’t come up with your great idea while still working for “mega-company”? And if you do, are you going to hide or destroy any evidence of its existence that might date it to your time at mega-company? If your idea or product is really good, are you willing to take that chance?

‘Silicon Valley’ is airing Sunday nights on HBO.

Being the Only Engineer in a Business Meeting

(via The Expert, A Hilarious Sketch About the Pain of Being the Only Engineer in a Business Meeting)

The pain of being the only engineer in a business meeting is perfectly illustrated in the comedy sketch, “The Expert.” The sketch was written and directed by Lauris Beinerts and is based on the (Russian language) short story, “The Meeting,” by Alexey Berezin.

The Digital Mechanics: App Developers Without An App Of Their Own

Andrew and I attended a Portland coders meeting last month, where two developers who worked for a large company gave presentations on recent efforts. One shared his custom wrapper of an existing iOS class. What struck me was how the presenter imagined out loud how his code might be utilized: “So let’s say your boss comes to you and asks you to do X…” and “So if your boss wants you to move things around…” Why was a boss’ command the example that popped into his mind? Wouldn’t he ever use his code in his own projects, or in any circumstance of his own volition?! No, “The Boss” comes down from on high, gracing the lowly programmer with his great wisdom, and decrees what new features shall be wrought. It felt gross. What’s it like to be someone whose imagination only takes them as far as what potential orders they may be given?

These are The Digital Mechanics. They make a good living applying technical skill, enough to live comfortably and support a family. While many programmer job listings still request at least a “BS in Computer Science” (an acronym I’m quite fond of), “equivalent experience” is usually accepted. So without racking up debt on a four-year degree, a person can get a good programming job as long as they’re self-motivated. 

If a car mechanic wants to build a car, they need a lot of physical parts. To design and build a line of cars, they’d have their work cut out for them, and need some serious startup capital. Programmers have the huge advantage of working mostly within a digital world, and the things they create have minimal real-world costs. The line between “digital-mechanic” and “digital-creator” is much easier to cross for those so inspired.

It would probably be “cooler” for me to claim I’d always wanted to be a programmer, that it’s my true passion and calling in life, but that just isn’t true. My interests have varied over the years, but the common thread is they all represent creative opportunities. When I was eighteen, I wanted to write songs and play in a band, so I learned to play music. When I needed artwork and a website, I learned how to create those things. When iOS was released, I had apps I wanted to make, so I learned how to build them. Any skill that I’m even marginally good at came about because I wanted to make something. And beyond those initial creative endeavors, I was usually able to apply the new skills to freelance work.

I’m always surprised to meet a developer who doesn’t have any independent products. Did they only aspire to be a “digital-mechanic”? Perhaps they went to school for computer science, took the nice day job, and that’s the end of the story?

Years ago I took piano lessons from a USC doctoral student in music. He was a great pianist and teacher, highly intelligent, and had encyclopedic knowledge of classical music and many other subjects. Being a songwriter, I once asked him if he wrote music. “Oh gosh, no,” he said, “what could I possibly add to what’s already out there? What do I have to say, and who would care?” I was disappointed, but I didn’t find his modesty, shyness, or embarrassment surprising. He knew and performed many “Great Works”. How could he ever match them?

These fears are the sticky trap we all have to wade through, and some seem to give up all creative hopes before realizing they can escape its hold. Ira Glass has a nice interview clip where he talks about “the gap” between a beginner’s creative work, which might not be very good yet, and their tastes, which may be highly refined. This gap can be discouraging. A creative beginner needs to remember it’s OK to not be great out of the gate. 

I’ve built and released several independent apps which provide a modest revenue stream, but their creation also adds value to my freelance work. It means I might’ve already had a chance to work with new frameworks in the iOS SDK, or might have used some new third-party tools that will benefit a freelance job. Clients pay developers because they have an idea, but lack the skills to implement it. Who would they rather pay: someone who has frequently labored from idea all the way through to product launch and beyond, or someone who has only built what was on the blueprints handed to them?

I recently considered taking a job at Apple as a software engineer, and went through a few phone interviews. The deal-breaker for me was that they would not allow me to do any other business related to programming in my off hours. I couldn’t release independent apps, couldn’t do freelance work, and couldn’t continue to operate as part of my two-man LLC, Secret Monkey Science. Taking that job would mean shutting down the part of my programming brain that isn’t just a “digital-mechanic.” I’d go nuts. What creative person would be OK with signing an agreement like that? Someone who might think “Yeah whatever, I hate computers anyway, and I wouldn’t even want to touch one outside of business hours”, or maybe someone who is so blindly in love with Apple that they’d want to give that corporation any and all of their creative output?

If Apple is made up of “digital-mechanics all the way down,” it makes me wonder where the company is headed. Maybe that worked under an innovative and tyrannical Steve Jobs, but under Tim Cook? Maybe not.

If you want to be a programmer, is it just to pay the bills? Fair enough! We all have to eat. Or maybe it’s because you have things you want to make, and programming is the skill that will help free those pesky ideas from your brain? In my opinion, that’s much better, and you’ll be building great skills that will go towards general programming work. Even if the things you create aren’t great, or don’t take the world by storm, you’ve approached the craft from a more interesting angle, which will help prepare you for all the challenges ahead.

My Brain Has No Space For Your User Interface

My iPhone’s alarm went off at 5:30 this morning, and I walked to the kitchen to start my fancy coffee maker, with its LCD display and navigation. Then at my desk I booted up a Mac and a PC. I navigated several programs, each with their own custom UIs. If I need to drive somewhere, I’ll get to use my car’s now-archaic touchscreen interface. Later I might watch something on DirecTV, or load MLB.tv on the Xbox, play a Blu-ray, or navigate through my smart TV for some streaming video.

How many different user interfaces am I expected to deal with now? Many have similarities, but each expects me to remember its navigation, commands, organization and quirks.

Behind these disparate UIs are companies fighting for market share. Sometimes their efforts create UIs that scream “Look at how advanced our software is! Come join us in the future!” They’re trying too hard, and making a mess in the process. Other times, it feels like the company knew the physical product (Blu-ray player, giant TV) was what was going to sell you, and I imagine the executives realizing “oh—and there needs to be some software on it, in order to do x, y, and z that we’re advertising on the box—make sure we get that done before it ships, too.”

Most days I use UIs from Apple’s OSX and iOS, which are pretty similar but still different, as Apple has decided to define them as separate ecosystems: one meant for fingers, and one for the mouse. Microsoft disagrees, and is using their “Metro” UI across platforms (PC, Xbox, Windows Phone). At least they’re trying to do something about all of this UI insanity. Then we have third-party applications built for these platforms, looking to impress us with all their UI innovations and divergence from conformity.

Cars, cable TV, Blu-ray players, smart TVs, Wii, PlayStation, Roku, Kindle, touch screen remotes, high-end coffee makers, fancy watches—they all present us with UIs that we’re expected to tuck away in our brains.

I imagine this ‘UI storage’ area of my brain is like the box in my closet containing a rat’s nest of computer “dongly things” and cables. It retains most of this UI knowledge—and I can get at it—but I have to detangle it from fifty other UI assumptions I’ve gathered over the years.

Let’s take something as simple as the ‘back’ button: We understand it, and we need it. One platform may have a hardware back button, while another wants you to press the ‘left’ arrow on a directional pad. On iOS, we have the hardware ‘home’ button. Browsers usually offer a software back button, and on OSX browsers you can also use a two-fingered trackpad swipe to go back. My car’s computer screen has a back button that looks like a U-turn sign lying on its side. On Roku, instead of pressing ‘left’ on the arrow pad to go back, you press ‘up’ while on the top row of an app’s main menu to go to the Roku home screen (there’s also a ‘home’ button, to be fair).

In 10 years, this UI list may look laughably small. We’ll probably be discussing the operating systems on our tube socks and dust pans. What can be done? As a developer, I fight this battle on software projects by using conventions where possible, keeping things clean and simple, and focusing on innovating the core functionality of the app—not the UI.  

What about the UIs we’re stuck with? You could start a letter-writing campaign to the worst-offending companies I guess, but that’s probably wasted energy. My “solution” is to simplify and clean up everything I can around the UIs I need to use.

While working with computer displays, our eyes move constantly, jumping around windows, searching lines of text, finding what’s important and dismissing what’s irrelevant. Only a small subset of the pixels is probably related to what your eyes are trying to locate at any given time. Each of these small decision-events takes brain processing, and eliminating the unnecessary ones can help alleviate some of the fatigue. Use a solid color desktop, organize folders, simplify toolbars and shortcut bars, and resist the desire to keep everything open at once if you don’t need it. Preventing the brain from burning energy on these unnecessary in-between tasks is like cutting calories from a diet. Granted, there’s not much you can do for many of the non-computer UIs described above. Your car’s UI probably isn’t going to get better, at least not until Apple’s “CarPlay” potentially takes off. I hope the developers building these interfaces can either get their act together and start settling on some conventions, or get weeded out of the market.

Discuss here on Hacker News.