After leaving Google and working in the open source ecosystem for the past few months, it’s become increasingly clear to me which pieces of Google’s infrastructure are ahead of the curve and which are not. One piece that’s clearly ahead is Google’s code review tool.
Google’s original code review tool was Guido van Rossum’s Mondrian, which he eventually open sourced as Rietveld, a project which was in turn forked into Gerrit. Mondrian has since been replaced by a newer system at Google but, to my knowledge, this new system has never been publicly discussed.
These code review tools all excel at showing side-by-side diffs. The difference between inline and side-by-side (two-column) diffs is so dramatic that I refuse to waste mental energy trying to grok the inline diffs that GitHub insists on showing.
There are a few ways to get two column diffs out of git (“git difftool”), but they all have their problems:
- Many diff tools (e.g. tkdiff) are X Windows programs that are clearly out of place in Mac OS X. They often don’t work well with the app switcher, or don’t render high-resolution (“retina”) type.
- Most diff tools want to operate on pairs of files: tkdiff and p4merge show you one file at a time, in isolation. Once you advance past a file, you can’t go back, and I like to flip back and forth through files when viewing a diff.
- They typically do not support syntax highlighting.
There are certainly diff tools for Mac OS X that do all these things well, but they tend to be commercial.
Enter “git webdiff”, my newly released side project, which aims to improve this situation.
Any time you’d run “git diff”, you can run “git webdiff” instead to get a nice, two-column diff in your browser, complete with syntax highlighting. Here’s what it looks like:
When you run “git webdiff”, your browser will pop open a tab with this UI. When you’re done, you close the tab and the diff command will terminate. It works quite well any time you have an even remotely complex diff or merge to work through.
You can install it with:
pip install webdiff
Please give it a try and let me know what you think!
The last project I worked on at Google recently launched: a new and improved Finance Onebox.
You trigger the feature by searching for a stock ticker, e.g. AAPL, GOOG or .INX:
For comparison, here’s what it used to look like:
The main new features are:
- A larger chart
- A cleaner, more modern design
- More relevant attributes (P/E ratio, Dividend Yield)
- After-hours trading on the stock chart
The interactivity comes in a few forms. First, you can click the tabs on the top to zoom out to 5 day, 1 month, 3 month, 1 year, 5 year or Max views.
Next, you can mouse over the chart to see the price at any point. On touch devices, tap/swipe with one finger:
But wait, there’s more! What if you see a change on the chart and want to know how large it was? You can click and drag across that time range to see a delta. On touch devices, you trigger this by putting two fingers on the chart:
This is particularly useful for longer-range charts, where it lets you easily answer questions like “how much did the S&P 500 drop from 2008 to 2009?”
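The delta shown for a selection is just the percent change between its two endpoints. A minimal sketch of that arithmetic (the index values below are hypothetical, for illustration only):

```python
def percent_change(start: float, end: float) -> float:
    """Percent change from start to end, as the chart's delta displays it."""
    return (end - start) / start * 100

# Hypothetical S&P 500 levels, not actual quotes:
drop = percent_change(1400.0, 800.0)
print(round(drop, 1))  # → -42.9
```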
After seeing this image posted on reddit last week, I took a deep dive into the strange world of extreme caving.
This image is big! Click through to see the whole thing.
The Krubera Cave is the deepest in the world, descending 2,197 meters from its inconspicuous entrance to its deepest explored areas. It’s located in Abkhazia, a breakaway territory in the Republic of Georgia. In some ways, caving is an even more extreme activity than high altitude climbing. Descending to the bottom of Krubera takes a team of dozens of people over a month, a month during which they’ll never see the sun.
One question I had was “why does an expedition end?” I got some answers from this amazing documentary about a 2003 expedition. Their goal was to explore this siphon at 1440m below the surface:
This past fall, my group launched a new feature on Google Search that we call “Fact Comparisons”. It triggers for many numeric fact queries, for example “distance from the sun to mars”:
The idea is that, by showing you answers to related questions (“how far is jupiter from the sun?”), we can help you contextualize the answer to your original question. The number “141,600,000 miles” is hard to fathom, but it makes more sense when you see that it’s between Earth (92.96M mi) and Jupiter (483.8M mi).
If you click on one of the related images, you’ll launch into a Carousel filled with related facts:
Most numeric facts will trigger this feature. The most popular numeric facts are people’s ages and heights. A strip of famous people’s ages is pretty interesting:
A few other fun ones:
I’ve referenced an anecdote from Vernor Vinge’s A Fire upon the Deep several times during video calls in the last few weeks and thought I’d share it here.
The novel is a classic space opera. Two ships with infinitely powerful computers are in communication over a very narrow channel. Rather than send pixelated images to one another, the computers go to extremes to make the best use of their limited bandwidth.
Fleet Central refused the full video link coming from the Out of Band … Kjet had to settle for a combat link: The screen showed a color image with high resolution. Looking at it carefully, one realized the thing was a poor evocation…. Kjet recognized Owner Limmende and Jan Skrits, her chief of staff, but they looked several years out of style: old video matched with the transmitted animation cues. The actual communication channel was less than four thousand bits per second
Precious bits aren’t wasted on low-level features like pixels. Rather, they’re used to transmit information about the participants and “animation cues”. The computer on the receiving ship creates the most realistic video it can using those cues and the imagery that it has on file.
The picture was crisp and clear, but when the figures moved it was with cartoonlike awkwardness. And some of the faces belonged to people Kjet knew had been transferred before the fall of Sjandra Kei. The processors here on the Ølvira were taking the narrowband signal from Fleet Central, fleshing it out with detailed (and out of date) background and evoking the image shown. No more evocations after this, Svensndot promised himself, at least while we’re down here.
Vernor Vinge calls it an evocation but we’d probably call it an avatar. Even with all its power, their computer has trouble reproducing motion and is constrained by its out-of-date database.
“Strange,” interrupted Pham. “The pictures were strange.” His tone was drifty.
“You mean our relay from Fleet Central?” Svensndot explained about the narrow bandwidth and the crummy performance of his ship’s processors…
“And so their picture of us must have been equally bad… I wonder what they thought I was?”
“Unh…” Good question. … “… wait a minute. That’s not how evocations work. I’m sure they got a pretty clear view of you. See, a few high-resolution pics would get sent at the beginning of the session. Then those would be used as the base for the animation.”
This is more or less how digital video is encoded: there are a few high-resolution keyframes followed by information about how the pixels change in each successive frame. Vinge’s computers do something similar. But instead of encoding how the pixels change, they encode at a higher level of abstraction. Presumably they’re recording that a hand moved or that a particular word was said, and the computer at the other end will do its best to update the keyframe to reflect this.
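A toy sketch of the pixel-level version of this idea, assuming frames are small grids of integer pixel values: store the first frame in full as the keyframe, then store each later frame only as the set of pixels that changed.

```python
def encode(frames):
    """Encode a list of frames (2D lists of ints) as a keyframe plus deltas.

    The keyframe is stored whole; each subsequent frame is a sparse list of
    (row, col, new_value) changes relative to the previous frame.
    """
    keyframe = frames[0]
    deltas = []
    prev = keyframe
    for frame in frames[1:]:
        changes = [
            (r, c, frame[r][c])
            for r in range(len(frame))
            for c in range(len(frame[r]))
            if frame[r][c] != prev[r][c]
        ]
        deltas.append(changes)
        prev = frame
    return keyframe, deltas

def decode(keyframe, deltas):
    """Rebuild the full frame sequence from the keyframe and deltas."""
    frames = [keyframe]
    current = [row[:] for row in keyframe]
    for changes in deltas:
        for r, c, value in changes:
            current[r][c] = value
        frames.append([row[:] for row in current])
    return frames
```

Real codecs encode motion vectors over blocks of pixels rather than individual pixel changes, but the shape is the same; Vinge’s evocations just push the deltas up to the level of faces, gestures, and words.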
In a world with limited bandwidth but infinite computing power, this is how things should work. Next time you’re in a video call and the image drops out or becomes grainy, think about how much better it would be if your counterpart turned into a cartoon instead!