Still, this is not a very forward-thinking solution. Building and combining microservices – effectively UNIX philosophy applied to the web – is the most effective way to make progress.
EDIT: Seems like I misunderstood the article – from the way I read it, it sounded like Google has a monolithic codebase, with heavily dependent products, deployed monolithically. As zaphar mentioned, it turns out this is just bad phrasing in the article and me misunderstanding that phrasing.
I take everything back I said and claim the opposite.
Yes, I'm sure Google has a lot to learn from SoundCloud about how to deploy software at scale, like that time NASA got advice from Estes Industries on how to launch rockets.
The issue was that they wanted to load the page on com.google with the user logged in, etc., so they implemented an explicit URL parameter to allow that.
> Attackers could have seized on the omission of the X-Frame-Options header to change a user's search settings, including turning off SafeSearch filters
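To make the omission concrete, here is a minimal sketch (not Google's actual code) of the kind of fix involved: a WSGI middleware that stamps an `X-Frame-Options` header onto every response, so an attacker's page cannot embed the site in a frame and trick a logged-in user into changing their settings.

```python
def add_frame_protection(app):
    """Wrap a WSGI app so every response carries X-Frame-Options.

    This is an illustrative sketch; the app name and structure are
    assumptions, not the real google.com serving stack.
    """
    def wrapped(environ, start_response):
        def patched_start(status, headers, exc_info=None):
            # Drop any pre-existing header, then add our own.
            # DENY forbids framing entirely; SAMEORIGIN would permit
            # frames from the same origin only.
            headers = [h for h in headers if h[0].lower() != "x-frame-options"]
            headers.append(("X-Frame-Options", "DENY"))
            return start_response(status, headers, exc_info)
        return app(environ, patched_start)
    return wrapped
```

With the header present, a browser refuses to render the page inside an attacker-controlled `<iframe>`, which is exactly the clickjacking vector the quoted passage describes.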
It explains how much modularization can help, even in a small company. Now look at Google, where some issues (like the google.com April 1st XSS issue) were only fixed after outsiders pointed them out.
Normally, the team responsible for that part should have caught it internally.
Google runs practically everything internally as services. Nothing about the code repository makes it impossible to run microservices. Where did you get the idea that Google runs a single monolithic app for everything?
You should think of Piper as a single filesystem which permits atomic multi-file edits. And that's about it; there's nothing in that which forces any particular release structure on you.
The talk did mention (briefly) that monolithic codebase and monolithic binaries/software aren't strictly related. It's likely that monolithic software is easier in a monolithic codebase, but I don't think that microservices are harder in a monolithic codebase. Yes, we tend to statically link, but that's for library dependencies (i.e. things like the protocol buffers libraries/definitions). I don't work at all in this area, so this is a guess, but I imagine that it's extremely rare for unrelated teams to have application logic linked into the same binary - just making a release would involve coordination from so many teams :)