Abstraction Boundaries

This is a phrase I picked up from my Compilers professor while finishing my CS degree at the U. When you Google the word `abstraction`, the results (as of 2023-08-07) show two definitions of the word and, to the right of them, a link to the Wikipedia page for Abstraction (computer science):

"topics will vary in degrees of abstraction"

"geometric abstraction has been a mainstay in her work"

To me, both of these definitions leave something to be desired, but that's beside the point. Since I kinda live in my bubble, I'm not aware of many uses of the word `abstraction` beyond describing a mental model or an analogy for something more concrete that the listener might not fully understand. I think another form of abstraction is marketing: the information a company chooses to share with a potential customer about the product it manufactures. This is, in my opinion, an art form as well, and some companies are better at this art than others, in the sense that the abstractions/marketing campaigns they create are convincing enough that someone decides the product is worth trying, consuming, or whatever verb is appropriate given the context.

Now you may say, “Zane, that is a bit of a stretch,” but look around and think about everything you have ever used or purchased. Did you buy it based on some marketing material, maybe a few opinions from friends and family, or online material from people who seem to understand that area of the world? Do you understand all of its complexities? I certainly don't, and I claim that you don't either. Not in the sense that you can't learn all the details behind a product and fully understand what it is and what it relies on, but in the sense that you can't do that for everything you use or buy and still foresee where the product will be in the near or far future. Do you understand what your insurance is, how its terms apply to you, and what they mean or cover?

Broadly, I don't think this matters, and it's something you don't need to worry about. I don't have to understand how my car works to use it; why should I? Politics are complicated; why don't we vote for someone to understand the complexity for us? Dealing with unforeseen accidents is financially hard; why don't we pay for insurance to manage the complexity for us? Having something to eat every day of the year is hard to do on your own; why don't we have stores where we can get food? Assigning value to things is hard; what if we had a form of currency in which to trade goods and let economics determine the value at which we should buy and sell them? Now, while I don't claim any opinions on these abstractions, they were all brought into this world to make a difficult part of it simpler for the everyday person. Which is a very good thing. Just think about the device you are reading this on. It's pretty crazy that we can all share and consume information at the current rate thanks to the addition of computers to this world.

Well, I am currently a CS undergrad learning about the iceberg that is computer science, in the sense that the abstraction most people understand it as isn't remotely close to the hardware that physically binds it to this world. Computers have changed the world around us incredibly fast and have been integrated into our lives whether we know it or not, for better or worse. This has added an insane amount of complexity that I don't think people really understand, or, to be honest, even care to understand.

Now, why am I even spending any time on this rant? Well, I don't claim to understand much of anything in this world at this point. I do understand that software needs to run on hardware, which in a general sense is either a GPU or a CPU or some other marvel of engineering, and that the only part of the software that really matters to that hardware is the machine code: the 1s and 0s that need to be created in order for the physical device to do something. Well, 1s and 0s are pretty hard to get in the correct order, so we add the first layer of abstraction in software: assembly code, a human-readable form of the 1s and 0s that are sent to the CPU. This seems good until you learn how a CPU actually works in practice, and that this assembly code abstraction is really what the industry likes to call an API or abstraction boundary, and that the hardware vendors can kinda do whatever they want as long as they honor the meaning of that API.

It seems fine; many companies add boundaries as a way to not paint themselves into a corner. Many companies separate the customer service part of the company from the part building and working on the product. Marketing is also a form of this abstraction: the underlying product is very complicated, so what message do we wanna share to help people understand why they should use it? But these abstractions aren't always good. Some companies get sued when they sell something they don't actually have, or, worst case, products fail.

For software, assembly is also hard to write well, so other smart ideas were added to the field, like programming languages and compilers, along with hardware ideas like caching and other engineering attempts to make the thing better. While the intent of making the product better is good, an accident was introduced called Spectre and Meltdown. I'm not gonna try and explain them here, as there are much better places with better abstractions for explaining what Spectre and Meltdown are. And while I have many ideas on how to avoid this, none of them are passable without trillions of dollars and lots of trust. “Everything temporary is permanent,” a friend once told me.
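To make that layering a bit more concrete, here's a tiny sketch of the same idea at all three levels. The C is real; the assembly and hex bytes are roughly what one x86-64 compiler might emit at one optimization level, and the exact output varies by compiler and flags:

```c
#include <stdio.h>

/* Layer 3, a programming language: we describe *what* we want. */
int add(int a, int b) {
    return a + b;
}

/* Layer 2, assembly: human-readable names for machine instructions.
   Roughly what gcc -O2 emits for add() on x86-64 (System V ABI:
   a arrives in edi, b in esi, the result goes back in eax):

       add:
           lea eax, [rdi + rsi]   ; eax = a + b
           ret

   Layer 1, machine code: the actual bits the CPU decodes. Those two
   instructions encode (in hex) as:

       8d 04 37    lea eax, [rdi + rsi]
       c3          ret
*/

int main(void) {
    printf("%d\n", add(2, 3)); /* prints 5, whichever layer you think in */
    return 0;
}
```

The abstraction boundary is exactly that middle layer: the C above is allowed to become *any* instruction sequence that returns a + b, and the CPU is allowed to execute those instructions however it likes, as long as each side honors the meaning.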

Well, assembly code may be an abstraction we are stuck with for a very long time. Everything up from there is a changeable abstraction, and to me, the truly fun part of computer science. If you can figure out how to express any idea so that assembly instructions can represent it on one or more computers, you can share that idea with almost anyone. Which to me is sorta a form of magic, but I want to use that word carefully, as it is often used to gloss over hard work. But being able to express ideas like video games and other forms of digital art in a space loosely bound by the physical world is really cool. Like, try explaining Fortnite, not just the game but everything needed to make it work, to your grandma who grew up in the Depression. At least mine would think you're insane.

To me, video games are my favorite example of an abstraction boundary. Not in the sense that I think they are perfect, but the complexity of 100 players on a wide array of hardware all playing with each other in the illusion of real time is really cool. But also in the sense that anyone, from kids to adults, can pick up the controller, and in a few minutes the software can teach them how to use it and communicate, to some degree, the rules of the abstraction its creators have built. When this abstraction falls apart, the game is hard or confusing to play, or just plain not engaging enough to play.
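That illusion of real time usually boils down to some variant of a fixed-timestep game loop: simulate the world in fixed, deterministic steps, render as often as you can, and interpolate in between. Here's a minimal sketch, assuming a POSIX clock; every function name in it is a hypothetical placeholder rather than any real engine's API, and real multiplayer games pile client prediction and server reconciliation on top of something like this:

```c
#include <stdio.h>
#include <time.h>

/* Hypothetical stand-ins for a real engine's clock/input/sim/render;
   none of these names come from any actual game or library. */
static double now_seconds(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);         /* POSIX monotonic clock */
    return (double)ts.tv_sec + ts.tv_nsec / 1e9;
}
static void poll_input(void)     { /* read the controller here */ }
static void simulate(double dt)  { (void)dt;    /* advance the world by dt */ }
static void render(double alpha) { (void)alpha; /* draw between sim states */ }

int main(void) {
    const double dt = 1.0 / 60.0;  /* fixed 60 Hz simulation step */
    double previous = now_seconds();
    double lag = 0.0;

    for (int frame = 0; frame < 600; frame++) {  /* bounded demo loop */
        double current = now_seconds();
        lag += current - previous;
        previous = current;

        poll_input();

        /* However long rendering took, catch the simulation up in
           deterministic fixed-size steps, so every machine computes
           the same world state from the same inputs. */
        while (lag >= dt) {
            simulate(dt);
            lag -= dt;
        }

        /* alpha in [0,1): how far real time has crept into the next
           step, used to interpolate what the player actually sees. */
        render(lag / dt);
    }
    return 0;
}
```

The fixed step is the abstraction boundary here: above it, the game's rules run the same everywhere; below it, each player's hardware renders and catches up however it can.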

To end this rant, let me try to give you some thought-provoking value in exchange for reading it. For those who aren't technical: try reading about how video games or real-time communication work, then explain it to a friend and try to answer their questions. For those who have some knowledge of programming: try to explain all the abstractions you'd have to understand and wrangle to build such a game to a coworker or friend, or better yet, to a willing non-technical friend. Do you use the same abstractions, or do you create other abstractions to gloss over the ones you don't fully understand, or just feel to be a little excessive?

I like to use the Fortnite example because it is voluntarily played, implying it is decent software from a consumer perspective, and I feel it covers a wide array of abstractions and complexity that its developers have had to wrangle and solve in software: UI/UX, distributed systems, graphics, math and physics simulations, and much more I'm not even fully aware of yet. That list doesn't cover all of it and has some overlap, but hopefully you can agree it at least covers a wide part of computing.

When I try to explain what I'm learning in school to my friends, I find things are more complex than they need to be, or I fail to even mention some abstractions entirely. This is most likely a failure on my part to articulate, or to fully understand why a given abstraction is there. But it also has me asking questions about everything that sits between the controller, what's displayed on the screen, and the 1s and 0s the CPU and GPU need to process in order to make that happen.

Can there be a better way? Which abstractions are actually necessary for the customer who bought a PS5 to play Fortnite with a friend who plays on an x86 computer? Why are the abstractions different between the x86 CPU in a PS5 and the one in the Windows desktop someone else is using? Why can't I run Fortnite on my old x86 Mac? Why is Linux not recommended over Windows for gaming? If the same physical device delivers a popular, enjoyable experience to millions, why is most software we use for work and education frustrating and confusing? Why did the abstraction that worked yesterday break today? Am I holding it wrong? What is the right abstraction boundary here?

If you have a comment or find a typo, feel free to make a pull request here.