Showing posts from December, 2021

Best of metadata in 2021

As has become our tradition, here are some highlights from my 2021 posts.

Systems
- Foundational distributed systems papers
- There is plenty of room at the bottom
- Graviton2 and Graviton3
- Fail-silent Corruption Execution Errors (CEEs) at CPU/cores
- SOSP21 conference (Day 1)
- Using Lightweight Formal Methods to Validate a Key-Value Storage Node in Amazon S3
- Building Distributed Systems With Stateright
- Sundial: Fault-tolerant Clock Synchronization for Datacenters
- Do tightly synchronized clocks help consensus?

Databases
- Linearizability
- What's Really New with NewSQL?
- A read-only transaction anomaly under snapshot isolation
- FoundationDB Record Layer: A Multi-Tenant Structured Datastore

Misc
- Learning a technical subject
- Your attitude determines your success
- Humans of Computer Systems: Obdurodon
- Facebook: The Inside Story (2020) by Steven Levy

Previous years in review
- Year in review 2020
- Year in review 2019
- Year in review 2018

Research, writing, and career advice

Learning a technical subject

I love learning. I wanted to write about how I learn, so I can analyze whether there is a method to this madness. I will first talk about what my learning process looks like in abstract terms, and then I'll give an analogy to make things more concrete and visual.

Learning is a messy process for me

I know some very clear thinkers. They are very organized and methodical. I am not like that. These tidy thinkers seem to learn a new subject quickly (and effortlessly) by studying the rules of the subject and then deriving everything about that subject from that set of rules. They speak in precise statements and have clear and hard-set opinions about the subject. They seem to thrive most in theoretical subjects. In my observation, those tidy learners are in the minority. Maybe the tidy thinkers are able to pull this feat off because they come from a neighboring domain/subject and map the context there to this subject quickly. But, again from my experience, it doesn't feel like that. It s

A read-only transaction anomaly under snapshot isolation

This paper, from SIGMOD 2004, is short and sweet. Under the snapshot isolation level, it shows a surprising example of a transaction history where a read-only transaction triggers a serialization anomaly, even when the update transactions are serializable. This is surprising because it was assumed that, under snapshot isolation, read-only transactions always execute serializably, without ever needing to wait or abort because of concurrent update transactions.

Background

Snapshot isolation is an attractive consistency model for transactions. Wikipedia has a very nice summary: In databases and transaction processing (transaction management), snapshot isolation is a guarantee that all reads made in a transaction will see a consistent snapshot of the database (in practice, it reads the last committed values that existed at the time it started), and the transaction itself will successfully commit only if no updates it has made conflict with any concurrent updates made since that snapshot. Sn
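The flavor of the anomaly can be sketched with a toy multiversion store. This is a hypothetical minimal simulation, not a real database API: the `SnapshotStore` class, the checking/savings account names, and the $1 overdraft fee are illustrative assumptions in the spirit of the paper's bank example. Each transaction reads from the snapshot taken at its start, and transactions with disjoint write sets never conflict, so all of them commit under snapshot isolation.

```python
# Toy multiversion store under snapshot isolation (hypothetical sketch).
class SnapshotStore:
    def __init__(self, initial):
        self.history = [dict(initial)]  # committed versions, newest last

    def begin(self):
        return dict(self.history[-1])   # snapshot of the last committed state

    def commit(self, writes):
        state = dict(self.history[-1])
        state.update(writes)
        self.history.append(state)

db = SnapshotStore({"checking": 0, "savings": 0})

# T1 starts: its snapshot shows checking=0, savings=0.
t1 = db.begin()

# T2 deposits 20 into savings and commits first.
t2 = db.begin()
db.commit({"savings": t2["savings"] + 20})

# T3 is read-only; it starts after T2's commit, so it sees T2's deposit.
t3 = db.begin()
print("T3 sees:", t3)  # checking=0, savings=20

# T1 now withdraws 10 from checking. Its (stale) snapshot total is 0,
# so it also charges a $1 overdraft fee. Its write set {checking} is
# disjoint from T2's {savings}, so snapshot isolation lets it commit.
total = t1["checking"] + t1["savings"]   # 0, from T1's old snapshot
fee = 1 if total - 10 < 0 else 0
db.commit({"checking": t1["checking"] - 10 - fee})

print("final:", db.history[-1])  # checking=-11, savings=20
```

The update transactions alone are serializable (the final state matches the serial order T1 then T2), yet T3 observed savings=20 with no withdrawal or fee on checking, a state that arises in no serial execution of T1, T2, and T3 together. That is the read-only anomaly.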

Humans of Computer Systems: Ted

Programming

How did you learn to program? Through a project in high school.

Tell us about the most interesting/significant piece of code you wrote. Electronic mail system.

Who did you learn most from about computer systems? Edsger Dijkstra https://www.cs.utexas.edu/users/EWD/

What is the best code you have seen? IBM 360 operating system.

What do you believe are the most important skills to be successful in your field? There are many paths to success, and a variety of skills to get there - no MOST IMPORTANT.

What quality or ability do you value most in a computer systems person? The ability to explain.

Personal

Which of your work/code/accomplishments are you most proud of? Read, e.g., Erich Fromm - pride is not a quality that should be considered.

What comes to you easy that others find hard? What are your superpowers? Recursion is natural to me.

What was a blessing in disguise for you? What seemed like a failure at the time but led to something better later

Graviton2 and Graviton3

What do modern cloud workloads look like? And what does that have to do with new chip designs? I found these gems in Peter DeSantis's ReInvent20 and ReInvent21 talks. These talks are very informative and educational. Me likey! The speakers at ReInvent are not just introducing new products/services; they are also explaining the thought processes behind them. To come up with this summary, I edited the YouTube video transcripts slightly (mostly shortening them). The presentation narratives have been really well planned, so this makes a good read, I think.

Graviton2

This part is from the ReInvent2020 talk by Peter DeSantis. Graviton2 is the best-performing general-purpose processor in our cloud by a wide margin. It also offers significantly lower cost. And it's also the most power-efficient processor we've ever deployed. Our plan was to build a processor that was optimized for AWS and modern cloud workloads. But what do modern cloud workloads look like? Let's start by
