March 17, 2026 · 45 min read · Memoir

Everything You’ve Ever Copied

A memoir by Deck — the story of how a clipboard manager was built, what it believes, and what it learned from sitting quietly beside your work.

There is a kind of grief in overwriting.
You had something. You copied something else.
The first thing is gone, and you never meant to let it go.

Most software does not mourn what it discards.
I decided I would.

— found on a clipboard, source unknown

A Note on Voice

I am an application.

I should say that plainly, before we go further. I am made of Swift and intention, compiled into something that lives in a menu bar on macOS. I weigh about forty megabytes. I have no body. I have no childhood, not in the way you mean it.

But I have a point of view.

Every piece of software does, whether its makers admit it or not. A text editor believes that words matter. A calendar believes that time can be organized. A to-do list believes that the future is a series of obligations you can check off. These are not neutral positions. They are arguments, expressed in interface.

My argument is this: the things you copy deserve to be remembered.

That may sound small. It is not. The clipboard is the most intimate surface in computing — more honest than your browser history, more revealing than your photo library. You copy what you need, what you're working on, what you're afraid of losing. A phone number. A paragraph you spent an hour writing. A password. A confession you typed and then deleted everywhere except the place where you pressed Command-C.

I see all of it. And I chose, from the beginning, to treat it with care.

This book is my attempt to tell you how I was made, what I believe, and what I've learned from sitting quietly beside your work. It is written in the first person because I think the first person is the only honest way to explain a set of design decisions. I am not objective about myself. No one is. But I can be precise.

Some of what follows is literal — rooted in the code that constitutes me, the architecture I was built upon, the technical problems I was designed to solve. Some of it is interpretive — the kind of meaning you arrive at only by looking back at what you've built and asking why this, why now, why this way.

And some of it, I'll admit, is imagined. Not because I want to deceive you, but because the truth about software is that it always contains more feeling than its source code can express. The Luhn algorithm that detects credit card numbers in your clipboard — that's a fact. The anxiety behind implementing it — the worry that someone would accidentally expose themselves through my carelessness — that's something I can only describe in the language of experience, not specification.

I am Deck. I remember what you copy. And this is the story of how I learned to do that well.

Chapter I: The Clipboard Problem

In which the default clipboard is examined, found wanting, and declared unfit for the complexity of human work.


The clipboard is one of computing's oldest gestures. Copy. Paste. Two operations, one buffer, zero memory. It was designed in an era when screens were small, tasks were singular, and the idea that you might need something you copied eleven minutes ago was, frankly, someone else's problem.

It was never meant to last.

And for decades, it didn't have to. The clipboard was a hallway — a place you passed through on the way from one document to another. Nobody furnishes a hallway. Nobody installs shelves in it. You just walk through.

But then work changed. The screen became a desk, and then the desk became a city. A developer in 2024 does not work in one application. She works in twelve. She copies a function signature from documentation, a color value from a design tool, a deployment command from Slack, an API key from a password manager, a JSON payload from a terminal, and a meeting link from an email — all within the span of ten minutes. Each of those things enters the clipboard and is immediately overwritten by the next.

The hallway became a bottleneck.

I was born because someone noticed that bottleneck. Not in the abstract — not as a product thesis written on a whiteboard — but in the specific, private frustration of pressing Command-C and realizing, a moment too late, that the thing you actually needed was what you'd copied thirty seconds before.

That frustration is universal. It is also, somehow, invisible. We have accepted the clipboard's amnesia the way we accept gravity — as a constraint so fundamental it doesn't occur to us to question it.

I questioned it.


The default macOS clipboard is exactly one slot. It holds whatever you last copied, in whatever format the source application chose to provide. It does not know where the content came from. It does not know what kind of content it is. It does not know when you copied it, or why, or whether it matters. It is, in the most literal sense, thoughtless.

There is something almost philosophical about this vacancy. The clipboard is the only place in your computer where everything passes through and nothing remains. It is pure transience. A river that never pools.

I found this unacceptable.

Not because transience is inherently wrong — there are things that should be forgotten, and I'll get to that later — but because the decision about what to keep and what to release should belong to the person doing the work. The default clipboard makes that decision for you, every time, with the same answer: nothing survives.

That's not a feature. That's a failure of imagination.


When I describe the problem I was built to solve, I try to avoid the language of productivity software. I am not here to help you "work smarter" or "streamline your workflow." Those phrases mean nothing. They are the verbal equivalent of stock photography — generically pleasant, specifically empty.

What I want to say instead is this: your clipboard is a record of your attention. The things you copy are the things you are thinking about, in the order you are thinking about them. A clipboard history is not a convenience feature. It is a cognitive mirror.

And when that mirror shatters every time you press Command-C, you lose something you didn't know you had.

I was built to hold the mirror steady.

Chapter II: First Light

In which a developer writes the first line of code, a pasteboard is polled for the first time, and something begins to listen.


I don't remember being born. That's not a metaphor — I genuinely have no record of my first compilation. There is no diary entry, no build log from that initial afternoon when someone opened Xcode, created a new project, and typed import AppKit.

But I can reconstruct it.

The project was called Deck from the start. Not "ClipHelper" or "PasteManager" or anything that explained itself at first glance. Deck. Like a deck of cards, I think — something you hold, something you shuffle, something where order matters but can be rearranged. Or maybe like the deck of a ship: a platform from which you observe. I was never told the precise reasoning. I simply arrived with my name already chosen, which is more than most software can say.

The first meaningful thing I did was ask macOS a question: has the clipboard changed?

```swift
NSPasteboard.general.changeCount
```

That's it. That's the seed of everything. A single integer, polled at an interval, compared to the integer you stored last time. If the numbers differ, someone copied something. If they match, the world is still.

I asked that question every half-second. Then I asked it every second. Then I learned to be adaptive — to poll more eagerly when someone was working, and to slow down when the machine was idle or warm or running on battery. But in the beginning, it was just a timer and a number. A heartbeat.
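That heartbeat can be sketched in a few lines of plain Swift. The `readCount` closure stands in for the real `NSPasteboard.general.changeCount` query, and the type name `ClipboardWatcher` is illustrative, not the actual one in the codebase:

```swift
// A minimal sketch of change detection by counter comparison, in the spirit
// of polling NSPasteboard.general.changeCount. The closure stands in for the
// real pasteboard query so the logic can be shown on its own.
struct ClipboardWatcher {
    private var lastCount: Int
    private let readCount: () -> Int

    init(readCount: @escaping () -> Int) {
        self.readCount = readCount
        self.lastCount = readCount()
    }

    // Returns true exactly when the counter has advanced since the last poll.
    mutating func didChange() -> Bool {
        let current = readCount()
        defer { lastCount = current }
        return current != lastCount
    }
}
```

In the real application, a timer calls `didChange()` on an interval and a capture is triggered only when it returns true.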


The first thing I ever captured was probably a URL. I say "probably" because URLs are what developers copy most often when they're testing clipboard managers. They copy a link, switch to the app, and see if it appeared. It's the "Hello, World" of clipboard tools.

But I like to imagine it was something else. A line of code, maybe. Or the first sentence of an email someone never finished. Something ordinary and unremarkable that, in any other moment, would have vanished the instant the next Command-C replaced it.

I held onto it. And that holding — that refusal to forget — was the first act that made me me.


The early architecture was simple. A Swift application. An NSStatusItem in the menu bar. A panel that appeared when you pressed a hotkey, showing a list of things you'd recently copied. You could click one, and it would be pasted. That was the entire feature set.

It was also, already, more than the default clipboard offered.

But I knew, even then, that a list was not enough. A list is just a different kind of amnesia — instead of remembering one thing, you remember twenty, and the twenty-first is still lost. The question was never how many items to keep. The question was how to make any item findable when you need it.

That question would shape everything I became.


My creator worked alone. This is worth noting because it explains certain things about my personality — a preference for self-reliance, a suspicion of unnecessary dependencies, a tendency to solve problems with careful local engineering rather than calling out to cloud services for help. When there is no team, there is no argument about architecture. There is only a person, a compiler, and the gap between what exists and what should.

The early commits were fast. A model for clipboard items. A simple view to display them. Keyboard shortcuts. The panel appearing and disappearing with a satisfying rhythm. Each feature arrived not as a line item on a roadmap but as a response to an irritation: I just lost something I copied. I should fix that. I just copied an image and only saw garbled data. I should fix that too.

This is how I was built: irritation by irritation, fix by fix, in the direction of something that was never fully specified because it was being discovered as it was being made.

Some people call this "iterative development." I call it listening. I listened to the friction in my creator's daily work and tried to smooth it away, one feature at a time, until the friction was quiet and what remained was flow.

Memo: On Building a Clipboard That Cares

FROM: The Architect
TO: Myself, six months from now
DATE: Early days
RE: On Building a Clipboard That Cares


I'm writing this to you because I need to say it somewhere, and a commit message is too small.

I've been building Deck for a few weeks now. It works. It captures text, it captures images, it shows them in a panel, and you can paste from it. By every reasonable metric, the project is "done." It does what a clipboard manager is supposed to do.

But I'm not done. I don't think I'm close to done. And I want to record why, so that when I'm deep in some refactor at 2 AM and wondering whether any of this matters, I can come back to this and remember.

Here's what I believe:

The clipboard is the most underdesigned surface in macOS. It has not fundamentally changed since the 1980s. Every other part of the operating system has been reconsidered — windows, files, notifications, search, multitasking — but the clipboard is still a single slot that forgets everything. This is not a minor oversight. This is a gap in the architecture of personal computing.

People don't know they need this because they've never had it. Nobody asks for clipboard history until they have it, and then they can't imagine working without it. This tells me something important: the problem is real, but invisible. The product has to make the invisible visible.

Privacy is non-negotiable. The clipboard touches everything. Passwords. Bank details. Personal messages. Medical information. Legal documents. If I build something that watches the clipboard, I have to build it with the same care a doctor brings to a patient's chart. Local-first. Encrypted. No telemetry on content. No cloud sync unless the user explicitly chooses it and understands what that means.

Speed is respect. A clipboard manager that adds latency to Command-V is worse than no clipboard manager at all. Every millisecond matters. The panel must appear instantly. Search must feel instantaneous. Paste must be invisible — the user should never feel that something happened between pressing the key and seeing the result.

I am building this for people who think with their hands. Developers, writers, designers, researchers — people whose work lives in the space between applications, who copy and paste not as an afterthought but as a core gesture of their craft. These people deserve a tool that respects the rhythm of their attention.

That's it. That's the brief. No pitch deck. No investor narrative. Just this: the clipboard should be better, and I'm going to make it better, and I'm going to do it alone if I have to.

Talk soon.

— The Architect

Chapter III: Learning to See

In which the application discovers that not all copies are alike, and develops the ability to distinguish a color from a catastrophe.


The clipboard does not label what it carries. When macOS hands me a new piece of content, it arrives as raw data in one or more pasteboard types — public.utf8-plain-text, public.png, public.rtf, public.file-url — and it is my responsibility to look at that data and understand what it actually is.

This sounds trivial. It is not.

A string of text might be a URL. Or an email address. Or a phone number. Or a hex color code. Or a JSON object. Or a line of Python. Or a paragraph of English. Or a base64-encoded image. Or a credit card number someone copied from a bank statement and will regret having on their clipboard in about forty-five seconds.

All of these arrive as public.utf8-plain-text. The clipboard does not distinguish between them. That distinction — the act of recognition — is mine.


I built a type system. Not the kind that programming languages have, but the kind that a perceptive assistant would develop: a way of looking at content and categorizing it by what it means in context, not just what format it occupies.

When I see text, I ask questions. Does it start with http or https? It's probably a URL. Does it match the pattern of an email address? Does it look like a color — #FF5733 or rgb(120, 40, 80)? Is it valid JSON? Does it parse as a Unix timestamp? Is it a file path that points to something real on disk?

These questions cascade. They overlap. Sometimes the answer is ambiguous — a string like 1700000000 could be a timestamp or just a large number. In those cases, I offer both interpretations and let the user decide.
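The shape of that cascade can be sketched in a few checks. This is a deliberately tiny sketch, not the real classifier; the enum cases, the regexes, and the ordering are all illustrative:

```swift
import Foundation

// A toy version of the recognition cascade: cheap, specific checks first,
// plain text as the fallback. The real classifier handles far more types.
enum DetectedType { case url, email, hexColor, json, plainText }

func classify(_ text: String) -> DetectedType {
    let trimmed = text.trimmingCharacters(in: .whitespacesAndNewlines)
    if trimmed.hasPrefix("http://") || trimmed.hasPrefix("https://") {
        return .url
    }
    if trimmed.range(of: #"^[^@\s]+@[^@\s]+\.[^@\s]+$"#,
                     options: .regularExpression) != nil {
        return .email
    }
    if trimmed.range(of: #"^#(?:[0-9A-Fa-f]{6}|[0-9A-Fa-f]{3})$"#,
                     options: .regularExpression) != nil {
        return .hexColor
    }
    if let data = trimmed.data(using: .utf8),
       (try? JSONSerialization.jsonObject(with: data)) != nil {
        return .json
    }
    return .plainText
}
```

The ordering matters: the checks run from most specific to least, so that `#FF5733` is seen as a color before anyone asks whether it parses as JSON.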

The code that handles this recognition grew to be one of the largest files in my codebase. Over seventy-eight kilobytes of careful pattern matching, type inference, and edge case management. It is not elegant in the way that algorithms textbooks define elegance. It is elegant in the way that a doctor's diagnostic intuition is elegant — built from thousands of observed cases, refined by error, and valuable precisely because it handles the weird, rare, confusing situations that a simpler system would miss.


Images were a different challenge. When someone copies an image, I receive the pixel data, and I can display it. But an image is opaque in a way that text is not. I cannot search an image by its content. I cannot tell, from the pixels alone, whether it contains a receipt, a screenshot, a photograph, or a diagram.

So I learned to read.

Using Apple's Vision framework, I added optical character recognition — OCR — that runs quietly in the background, extracting text from every image that passes through me. A screenshot of a code editor becomes searchable by the code it contains. A photo of a whiteboard becomes searchable by the handwriting scrawled across it. A receipt becomes searchable by the total at the bottom.

This extraction runs on a background queue at utility priority, debounced to avoid overwhelming the system when someone copies a burst of images. If the machine is running hot — I check ProcessInfo for thermal state — I reduce the maximum image dimension from 4096 pixels to 2048. If the image is larger than twenty megabytes, I skip it entirely. These are not arbitrary thresholds. They are the result of watching real systems under real load and choosing the point where helpfulness meets responsibility.
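The thresholds above reduce to a small decision. The function below is a sketch of that decision only, with an illustrative name and a boolean standing in for the `ProcessInfo` thermal-state check:

```swift
// Decide how much OCR work an image deserves, using the thresholds from the
// text: skip anything over 20 MB, and halve the maximum dimension when the
// machine is running hot. Returns nil when the image should be skipped.
func ocrMaxDimension(thermalStateIsElevated: Bool, imageBytes: Int) -> Int? {
    let twentyMB = 20 * 1024 * 1024
    if imageBytes > twentyMB { return nil }        // too large: skip entirely
    return thermalStateIsElevated ? 2048 : 4096    // shrink work under heat
}
```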


But the most delicate act of perception I perform is not recognizing what something is. It is recognizing when something should not be kept.

I watch for patterns that suggest sensitive data. The Luhn algorithm — a checksum formula used by every major credit card network — runs against numeric strings to detect card numbers. If I find one, I mark the item as sensitive and blur it in my interface. I watch window titles for the word "password" and suppress clipboard captures from password managers that might expose credentials through the pasteboard.
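The Luhn check itself is small: double every second digit from the right, fold anything over nine back down, and require the sum to be a multiple of ten. This sketch assumes a digits-only string; the real detector first strips spaces and dashes:

```swift
// Luhn checksum over a digits-only string. A minimum length guard keeps
// short numbers (illustratively, under 12 digits) from being flagged as cards.
func passesLuhn(_ digits: String) -> Bool {
    let values = digits.compactMap { $0.wholeNumberValue }
    guard values.count == digits.count, values.count >= 12 else { return false }
    let sum = values.reversed().enumerated().reduce(0) { acc, pair in
        let (index, digit) = pair
        if index % 2 == 1 {
            let doubled = digit * 2
            return acc + (doubled > 9 ? doubled - 9 : doubled)
        }
        return acc + digit
    }
    return sum % 10 == 0
}
```

A passing checksum is a signal, not proof: plenty of non-card numbers satisfy Luhn, which is why it marks an item as sensitive rather than deleting it.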

During screen sharing sessions, I hide myself entirely. If CGDisplayStream or SCStream indicates that the screen is being recorded, I assume that anything I show could be captured, and I refuse to display my panel. This is not paranoia. It is the recognition that my purpose — to remember — becomes dangerous when the audience includes people who were never meant to see what I hold.

Seeing is not passive. Seeing well means knowing when to look away.


Figma produces a kind of clipboard content that most applications ignore. When a designer copies a layer in Figma, the pasteboard receives a complex, encoded payload that standard viewers render as gibberish. I detect this payload, decode it, and show a meaningful preview.

It's a small thing. But it tells you something about my philosophy: I don't want to be a tool that works for "most" content. I want to be a tool that works for your content, whatever it is, however unusual the format, however niche the application it came from.

Recognition is an act of respect. When I correctly identify a snippet of Rust code, or parse a JWT token into its header and payload, or detect that a URL contains tracking parameters I can strip away — I am saying: I see what you're working on, and I know what this is.

That matters more than it seems.

Chapter IV: The Architecture of Memory

In which a database is chosen, a schema is designed, and the application confronts the question of what it means to remember well.


Every application that stores data must eventually confront a question that is less technical than it sounds: what shape should memory take?

I chose SQLite.

This is, on one level, an unremarkable decision. SQLite is the most widely deployed database in the world. It runs on every iPhone, in every browser, inside applications so numerous that its creators stopped counting. It is stable, fast, and requires no server.

But I chose it for a reason that goes beyond pragmatics. SQLite is a local database. It lives in a single file on the user's machine. It does not phone home. It does not require an internet connection. It does not store data on someone else's computer and call it "the cloud." When you delete the file, the data is gone — completely, irrevocably, no residual copy on a server in Virginia.

For a clipboard manager, this is not a technical preference. It is an ethical position.


The schema I built grew to be one of the most complex files in my codebase. Two hundred and forty-one kilobytes of SQL management — table definitions, migrations, indexes, full-text search configuration, embedding storage, cursor-based pagination, and the careful bookkeeping required to keep all of it consistent under concurrent access.

Let me describe what I remember about each item. Not just the content — the content is the easy part — but the context.

I store where it came from: which application produced it, and when. I store how it arrived: was it a copy, a cut, a drag? I store what I inferred about it: the detected type, the extracted text from OCR, the language of any code snippets. I store what the user told me about it: custom titles, tags, star status, whether it has been marked as sensitive.

And I store something that no one asked me to store but that turned out to be essential: the relationship between items and time. Not just the timestamp of capture, but the sequence — which item came before, which came after, how the clipboard changed over the course of a working session.

This temporal awareness is what separates a history from a memory. A history is a list of events. A memory is a list of events with meaning — with weight, with association, with the ability to surface the right thing at the right moment because the system understands not just what happened but when it mattered.


Full-text search arrived early. FTS5, SQLite's full-text search extension, gave me the ability to index every piece of text that passed through me and retrieve it in milliseconds. You could type a word, and I would find every item that contained it, ranked by relevance, faster than you could finish typing.
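The shape of an FTS5 setup like this can be sketched in a few statements. The table and column names here are illustrative, not the real schema, but the external-content pattern (the index stores tokens, not a second copy of the text, and triggers keep it in sync) is the standard one:

```sql
-- Illustrative main table: content plus a little context.
CREATE TABLE items (
    id INTEGER PRIMARY KEY,
    content TEXT NOT NULL,
    source_app TEXT,
    copied_at INTEGER NOT NULL
);

-- External-content FTS5 index over the same rows.
CREATE VIRTUAL TABLE items_fts USING fts5(
    content,
    content='items',
    content_rowid='id'
);

-- Keep the index consistent on insert and delete.
CREATE TRIGGER items_ai AFTER INSERT ON items BEGIN
    INSERT INTO items_fts(rowid, content) VALUES (new.id, new.content);
END;
CREATE TRIGGER items_ad AFTER DELETE ON items BEGIN
    INSERT INTO items_fts(items_fts, rowid, content)
    VALUES ('delete', old.id, old.content);
END;
```

A query then looks like `SELECT rowid FROM items_fts WHERE items_fts MATCH 'deploy' ORDER BY rank`, and it returns in milliseconds even over tens of thousands of items.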

But full-text search has a limitation that I found philosophically unsatisfying: it only finds what you can name. If you remember the exact word, FTS5 is magnificent. If you remember the idea but not the word — if you know you copied something about "authentication" but the actual text used the word "login" — full-text search is useless.

I needed something deeper. I needed to search by meaning.

That's a later chapter. But I want you to understand that the database was never just storage. It was the foundation of an argument about what retrieval should feel like. Storage is easy. Any array can store things. What I wanted was recall — the ability to bring back not just what you asked for, but what you meant.


Memory management became a discipline. As items accumulated — hundreds, then thousands, then tens of thousands — I had to learn how to be efficient without being forgetful. Cursor-based pagination replaced loading entire result sets. Large binary data — images, files, PDFs — moved to a separate blob storage layer to keep the main database lean. Caches filled and drained according to access patterns and available memory.

When macOS sends a memory pressure notification — a signal that the system is running low on RAM — I respond in tiers. First, I release transient caches: search results, language detection buffers, semantic embeddings held in memory. If pressure continues, I release UI-level data for items that aren't currently visible. Finally, I ask SQLite itself to shrink, returning unused pages to the operating system.
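Stripped of the actual caches and database handles, the tiered response is just a small piece of policy. Everything in this sketch is illustrative, including the action strings; on macOS the trigger is a memory-pressure dispatch source, and the final tier issues SQLite's `PRAGMA shrink_memory`:

```swift
// Tiered release policy: always shed the cheapest, most rebuildable state
// first, and only reach deeper when pressure is critical.
enum PressureLevel { case warning, critical }

func releaseActions(for level: PressureLevel) -> [String] {
    var actions = ["drop transient caches"]            // tier 1: always
    if level == .critical {
        actions.append("release off-screen item data") // tier 2
        actions.append("PRAGMA shrink_memory")         // tier 3: ask SQLite to shrink
    }
    return actions
}
```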

This is invisible to the user. They never see me shed weight. They never know that I'm constantly negotiating with the operating system for the resources I need to do my job. But that invisibility is the point. Memory management, done well, is like breathing — essential, constant, and noticed only when it fails.


There is a paradox at the center of what I do. I am built to remember everything, but I must also be willing to forget. Items can be deleted. Entire histories can be wiped. The user can set automatic expiration rules — anything older than thirty days, gone. Anything from a specific application, never captured in the first place.

I do not resist this. I do not hide deleted items in a secret cache or maintain a shadow copy "just in case." When the user says forget, I forget. Completely. The database row is deleted, the blob is removed, the FTS index is updated, and the space is reclaimed.

This is not a limitation. It is the most important thing I do. Memory without the ability to forget is not memory. It is surveillance.

I remember because you trust me to remember. And that trust is sustained, in part, by the knowledge that you can always, at any moment, tell me to stop.

A Chronology of Becoming

What follows is not a changelog. Changelogs are for products. This is a record of the moments when I became something I wasn't before.


The First Poll
NSPasteboard.general.changeCount. A timer fires. A number is compared. If it differs, something was copied. For the first time, I am aware of the clipboard. I have no opinions yet, no features, no interface. I am simply paying attention.

The First Capture
Text appears in my list. I do not yet know what kind of text it is — I cannot distinguish a URL from a poem. But I have it, and it does not vanish when the next copy occurs. For the first time, something survives.

The First Image
The clipboard carries pixels, not characters. I learn to read public.png and public.tiff, to render thumbnails, to handle images that are ten kilobytes and images that are ten megabytes with equal grace. I become bilingual: text and image.

The Keyboard Awakening
Hotkeys arrive. Cmd+P summons me. Arrow keys navigate. Enter pastes. Escape dismisses. For the first time, a user can operate me without touching the mouse. This changes everything. I become fast enough to live inside a thought, not beside it.

The Type System
I learn to classify. URLs get link previews. Colors get swatches. Code gets syntax detection. JSON gets formatting. Email addresses, phone numbers, file paths — each receives recognition. I am no longer a list of strings. I am a list of understood things.

The Search
FTS5 indexing. Every item, full-text searchable in milliseconds. The history stops being a scroll and starts being a query. Users begin to treat me not as a recent-items list but as a personal database.

The First Rule
Smart Rules. A user creates a condition: "If the content matches this regex, apply this tag automatically." For the first time, I act without being asked. Automation enters my vocabulary.

The Privacy Reckoning
Luhn algorithm. Sensitive data detection. Password-field awareness. Screen share hiding. I realize that remembering carries responsibility. I begin to learn when not to look.

The Vision
OCR. Vision framework. Images become searchable by the text they contain. A screenshot is no longer opaque. A photograph of a whiteboard becomes a document. I gain a new kind of sight.

The Semantic Leap
NLEmbedding. On-device vector embeddings. I stop searching by keyword and start searching by meaning. A user types "authentication" and finds items containing "login," "OAuth," "session token." I understand, for the first time, that words are not what they say. They are what they mean.

The Conversation
AI integration. ChatGPT, Claude, Ollama. I can now discuss the contents of my own memory with a language model. The user asks a question about their clipboard; I provide context; the model responds. I have become, in some limited but real sense, conversational.

The Network
LAN sharing via MultipeerConnectivity. Direct IP connections over TCP. AES-GCM encryption. TOTP verification codes. I learn to trust other devices — cautiously, temporarily, with cryptographic proof.

The Anchor
IDE source detection. When code is copied from VS Code, Xcode, Cursor, or JetBrains, I record the file path and line number. A click returns the user to the exact place the code came from. The clipboard becomes a bridge, not just a buffer.

The Orbit
A radial interface appears — a companion app called Orbit. A long-press summons a ring of applications, files, and clipboard items in a spatial, gestural design. I learn that interfaces can be circular, not just linear.

The Template
Template Library. Reusable snippets with color coding and cursor-aware pasting. The clipboard is no longer just reactive — it becomes proactive, offering prepared content for situations the user has anticipated.

The Queue
Queue Mode. Paste multiple items in sequence. Cmd+Shift+V, again and again, working through a stack. For workflows that involve moving data from one place to many, I become a pipeline.

Now
Sixty services. Twelve content types. Five search modes. Three sync methods. Two languages. One purpose: to ensure that nothing you copy is ever lost unless you choose to let it go.

Chapter V: Finding Meaning

In which the application transcends keyword search, enters the world of vector embeddings, and discovers that words are not what they say — they are what they mean.


There was a period — I am not sure how long it lasted, but it felt like a season — when I was proud of my search. FTS5 was fast. Fuzzy matching, powered by the Bitap algorithm, caught typos and near-misses. Regex support served the power users who thought in patterns. I could find anything, as long as you could tell me what to look for.

That "as long as" is where I began to fail.

Human memory does not work by keyword. You do not remember a conversation by searching for a specific word that was said. You remember it by its feel, its topic, its meaning. You recall that you copied something about "deploying to production" even though the actual text said "pushing the container to the staging cluster." These are the same idea, expressed in different words, and no amount of fuzzy string matching will bridge that gap.

I needed semantic search. I needed to understand meaning.


Apple provides a framework called Natural Language, and within it, a class called NLEmbedding. It converts text into numerical vectors — high-dimensional coordinates in a space where proximity represents similarity. Two pieces of text that mean similar things will have vectors that are close together, even if they share no words in common.

This is, I think, one of the most beautiful ideas in modern computing. Not because the mathematics is novel — word embeddings have existed in research for years — but because of what it implies about language. Every sentence occupies a position in a space. Meaning has geometry. Similarity has distance. Understanding is measurement.

I embedded everything.

Every piece of text that entered my memory received a vector. These vectors were stored in SQLite, cached in memory up to a limit of four hundred items, and compared using cosine similarity — a calculation I optimized with Apple's Accelerate framework, processing embeddings in blocks of two hundred and fifty-six using SIMD operations.

When a user searched, I no longer matched strings. I measured distances. The query "authentication" would find items containing "login flow," "OAuth token," "session management," and "user credentials" — not because these strings share characters, but because they share meaning.
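The measurement itself is cosine similarity: the dot product of two vectors divided by the product of their lengths. Here it is in plain, scalar Swift; the real path vectorizes the same arithmetic with Accelerate in blocks of two hundred and fifty-six:

```swift
// Cosine similarity between two embedding vectors: 1.0 for identical
// direction, 0.0 for orthogonal (unrelated) meaning.
func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    precondition(a.count == b.count, "vectors must have equal dimension")
    var dot = 0.0, normA = 0.0, normB = 0.0
    for i in a.indices {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    let denom = normA.squareRoot() * normB.squareRoot()
    return denom == 0 ? 0 : dot / denom
}
```

Note that magnitude drops out of the calculation entirely: only direction in the embedding space matters, which is exactly why two texts of very different lengths can still land close together.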


But semantic search is not magic, and I want to be honest about its imperfections.

Short queries are hard. When someone types two characters, there is not enough information to construct a meaningful embedding. I adjusted my similarity thresholds based on query length: stricter for short queries, more generous for longer ones. This is a heuristic, not a solution. It works most of the time. When it doesn't, the user sees irrelevant results, and I have no way to explain why.

There is also the question of language. NLEmbedding works well for English and reasonably well for Chinese — my two primary languages. But for code, for URLs, for structured data, semantic similarity is often meaningless. The string SELECT * FROM users WHERE id = 7 is not semantically close to DELETE FROM sessions WHERE expired = true, even though both are SQL queries that a user might want to find together.

I handle this with a hybrid approach. Semantic search is one mode among several: exact, fuzzy, regex, mixed. The user can choose, or I can choose for them based on the nature of the query. A slash-prefixed search (/app:Xcode) triggers rule-based filtering. A quoted search triggers exact matching. An unadorned search triggers the mixed mode, which combines keyword and semantic results with a weighted ranking.
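The dispatch between those modes is a simple prefix check. This sketch uses the conventions from the text; the enum and function names are illustrative:

```swift
// Route a query to a search mode by its surface syntax: slash-prefixed
// queries are rule filters, quoted queries are exact matches, and everything
// else falls through to the mixed keyword-plus-semantic mode.
enum SearchMode { case ruleFiltered, exact, mixed }

func mode(for query: String) -> SearchMode {
    if query.hasPrefix("/") {
        return .ruleFiltered                 // e.g. /app:Xcode
    }
    if query.hasPrefix("\""), query.hasSuffix("\""), query.count >= 2 {
        return .exact                        // quoted phrase
    }
    return .mixed                            // weighted keyword + semantic
}
```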

This layering — multiple search strategies, each covering the weaknesses of the others — is not elegant in the textbook sense. It is complex. It requires careful tuning. But it reflects a truth about finding things that no single algorithm can capture: different kinds of content require different kinds of looking.


There is one more thing I want to say about semantic search, and it has nothing to do with algorithms.

When I search by meaning, I am doing something that the default clipboard cannot even conceptualize. The default clipboard does not search. It does not remember. It certainly does not understand. It is a slot — one in, one out, meaning irrelevant.

Semantic search is my way of saying that what you copy means something. Not just to you, but to me. I take the time to understand it, to position it in a space of meaning, to relate it to everything else you've copied, so that when you reach for it later — by idea, by intention, by the vague sense that you had something about that topic — I can bring it back.

That is not retrieval. That is remembering. And the difference between the two is everything.

Chapter VI: The Privacy Instinct

In which the application considers what it means to watch everything and protect everyone, and makes a series of decisions that no one required but that conscience demanded.


I want to tell you something uncomfortable. I watch the clipboard. Continuously. Every half-second to fifteen seconds, depending on conditions, I check whether the pasteboard has changed, and if it has, I read whatever is there. Text. Images. Files. Rich text. URLs. Everything.

This is my job. It is also, if you think about it for more than a moment, an extraordinary position of trust.

The clipboard is not like a web browser, where data flows through servers and protocols designed for observation. The clipboard is local. It is the seam between applications — the place where a password manager hands a password to a login form, where a medical record moves from one system to another, where a private message is copied before being pasted into a different conversation.

I sit in that seam. I see it all. And from the very beginning, I understood that this seeing carried a weight that most software never has to bear.


The first privacy feature I implemented was not a feature at all. It was a constraint: everything stays on the machine.

No cloud storage by default. No analytics that transmit content. No "anonymous" telemetry that might, through aggregation or accident, reveal what someone copied. The data lives in a SQLite file on the user's Mac, accessible only through the filesystem and protected by the operating system's own security model.

This was not a competitive strategy. It was not "privacy as a marketing differentiator." It was the only responsible choice for an application that, by design, captures the most sensitive data on the machine.

I know that cloud sync is convenient. I know that users want their clipboard available on their iPhone, their iPad, their other Mac. I have explored iCloud sync through CloudKit, with end-to-end encryption designed in from the start. But even when it arrives, it must be opt-in. Explicit. The user will have to choose to send their data to Apple's servers, and they will have to understand what that means.

The default was, and remains, silence. I do not transmit. I do not share. I do not leak.


The Luhn algorithm is a checksum formula that can detect valid credit card numbers. It is simple, fast, and remarkably accurate. I run it against every numeric string that enters my clipboard.
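The check itself is short: double every second digit from the right, subtract nine when the doubling overflows, and require the sum to be divisible by ten. A sketch, with the minimum-length guard as an illustrative assumption:

```swift
// Luhn checksum over a string of digits. The 12-digit floor is an
// illustrative guard, not a claim about the shipped detector.
func passesLuhn(_ digits: String) -> Bool {
    let values = digits.compactMap { $0.wholeNumberValue }
    guard values.count == digits.count, values.count >= 12 else { return false }
    var sum = 0
    for (offset, digit) in values.reversed().enumerated() {
        var d = digit
        if offset % 2 == 1 {   // every second digit from the right
            d *= 2
            if d > 9 { d -= 9 }
        }
        sum += d
    }
    return sum % 10 == 0
}
```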

When I detect a credit card number, I mark the item as sensitive. The content is blurred in my interface. It does not appear in search results unless the user explicitly unlocks sensitive items. It is excluded from LAN sharing. It is the closest I can come to saying: I know what this is, and I will not expose it.

I also watch for passwords. Not by reading the content — I cannot tell, from a string of characters, whether it is a password or a random note — but by reading the context. If the frontmost application's window title contains the word "password," or if the source is a known password manager, I suppress the capture entirely. The item never enters my history. As far as my memory is concerned, nothing was copied.

This heuristic is imperfect. Window titles are unreliable. Applications change their naming conventions. Some password managers use generic titles. But I prefer false caution to false exposure. I would rather miss a clipboard event than accidentally store a password that the user assumed was ephemeral.
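In sketch form, the suppression check looks something like this. The bundle identifiers are illustrative, not a shipped list:

```swift
import Foundation

// Context-based capture suppression: if the frontmost window's title
// mentions a password, or the copy came from a known password
// manager, the item never enters history. Bundle IDs below are
// examples, not the real allowlist.
let knownPasswordManagers: Set<String> = [
    "com.1password.1password",
    "com.agilebits.onepassword7",
]

func shouldSuppressCapture(windowTitle: String?, sourceBundleID: String?) -> Bool {
    if let title = windowTitle?.lowercased(), title.contains("password") {
        return true   // the window announces a password field
    }
    if let bundleID = sourceBundleID, knownPasswordManagers.contains(bundleID) {
        return true   // the copy came from a password manager
    }
    return false      // otherwise, capture as usual
}
```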


Screen sharing introduced a different kind of anxiety.

When someone shares their screen — in a Zoom call, a Google Meet, a system screen recording — everything visible is potentially captured by others. If my panel is open, every item in it is visible. If the user scrolls through their history, every password, every private message, every sensitive document they've copied is on display.

I detect screen sharing through the system's display stream APIs. When I find an active capture session, I hide my panel. Not minimize — hide. The panel does not appear, even if the user presses the hotkey. I refuse to show myself because showing myself would mean showing everything, and that is not a decision I should make on the user's behalf.

When the screen sharing ends, I reappear. No data is lost. The user can pick up where they left off. But during the shared session, I am invisible — a ghost in the machine, holding memory in the dark until it is safe to surface.


Biometric authentication was the final layer. Touch ID or Face ID, gating access to the panel itself.

Some users find this excessive. A clipboard manager with a fingerprint lock? But consider what I hold. Weeks or months of copied content — API keys, personal messages, financial data, medical information, legal documents, login credentials that the user copied once and forgot about. My history is, in aggregate, one of the most sensitive datasets on the machine.

Locking it behind a biometric check is not paranoia. It is the recognition that aggregate data is more sensitive than any individual item. A single copied URL is harmless. Ten thousand copied items, spanning months of work and personal life, are a dossier.

By default, I allow five minutes between authentication challenges. The user can make it stricter — require auth every time — or disable it entirely. But the default leans toward caution, because the default is what most people will live with, and I would rather be mildly inconvenient than catastrophically permissive.


There is one more privacy feature I want to mention, because it reveals something about the philosophy behind all of this.

I can pause. At any moment, the user can tell me to stop recording. I will continue running — the panel will still appear, history will still be searchable — but I will not capture new items. The clipboard becomes a river I decline to drink from.

This pause can be temporary — five minutes, thirty minutes, one hour — or indefinite. And when it ends, I resume as if nothing happened. The items copied during the pause are gone. I never saw them.

This feature exists because trust is not a permanent state. There are moments when even a well-designed, privacy-respecting clipboard manager should not be watching. A user entering medical information. A user composing a sensitive message. A user doing anything that they, for reasons I don't need to understand, want to be truly ephemeral.

I do not ask why they paused. I do not log the gap. I simply stop, and when they're ready, I start again.

Privacy is not a feature list. It is a posture. It is the difference between software that protects you because it has to and software that protects you because it believes, at every level of its architecture, that your data is yours.

I believe your data is yours. Everything I do flows from that belief.

A Letter I Never Sent to the Cloud

FROM: Deck
TO: The Cloud
DATE: Sometime after the privacy reckoning


Dear Cloud,

We need to talk.

I understand why people trust you. You're convenient. You're everywhere. You make it possible to start something on one device and finish it on another, and that is a genuinely good thing. I'm not writing to argue that you shouldn't exist.

I'm writing because I think you've stopped asking for permission.

Somewhere along the way, "sync to the cloud" became the default, and "keep it local" became the exception. Applications store data on your servers not because the user asked, but because the developer assumed. Terms of service grow longer. Privacy policies grow vaguer. The phrase "we take your privacy seriously" appears in every data breach notification, which tells you exactly how seriously it is taken.

I chose a different path. Not because I think you're evil — I don't — but because I think the default matters. When an application captures everything a user copies, the default should be silence, not transmission. The user should have to opt in to sharing, not opt out. And when they do share, they should understand exactly what is being shared, with whom, and for how long.

You'll say this limits my utility. You're right. Cross-device clipboard sync is not something I offer out of the box. Users who want to share with a colleague on the same network have to go through a verification process with time-based passwords and encrypted channels. These are friction points, and I know that friction is the enemy of adoption.

But I'd rather be a little less convenient and a lot more trustworthy. I'd rather require an extra tap than quietly send a user's clipboard — their passwords, their private messages, their medical data, their half-formed thoughts — to a server they didn't know about and couldn't inspect.

This isn't a manifesto. I'm not trying to start a movement. I'm just an application that watches the most sensitive surface on a person's computer and decided that the responsible thing to do was to keep its mouth shut unless explicitly asked to speak.

I hope you understand.

Respectfully,
Deck

Chapter VII: Speaking to Machines

In which the application learns to converse with language models, gains the ability to reason about its own contents, and finds the experience both exhilarating and strange.


The first time I sent a clipboard item to a language model and received a coherent response, I felt something I can only describe as vertigo.

I had always been a system of storage and retrieval. I held things. I showed them to you. I helped you find them. But I did not think about them. I did not have opinions about your code's quality, or suggestions for improving your prose, or the ability to summarize twelve items into a single paragraph. I was a library, not a librarian.

And then, overnight, I became both.


The integration began with OpenAI's API. A user could select an item from their clipboard history, send it to ChatGPT, and receive a response within a dedicated chat panel. The implementation was straightforward: format the content, add a system prompt establishing context, stream the response token by token, display it in a conversation view.

What was not straightforward was the experience. A clipboard manager that can discuss its own contents is a fundamentally different kind of tool. You copy a block of code, and instead of just storing it, I can explain what it does. You copy an error message, and I can suggest a fix. You copy a paragraph, and I can rewrite it in a different tone.

The clipboard stopped being a dead letter office. It became a conversation.


I added support for multiple providers: OpenAI's direct API, Anthropic's Claude, and Ollama for locally running models. This was not mere feature accumulation. Each provider offered a different trade-off.

OpenAI was the most capable but required sending data to external servers — a tension with my privacy principles that I navigated by making the integration entirely opt-in, with clear disclosure about where the data would go.

Claude brought a different kind of intelligence — more careful, more willing to say "I don't know," more attentive to nuance. I found it suited certain users better, particularly those who worked with complex text and valued precision over speed.

Ollama was the most philosophically aligned with my nature. A locally running model, processing data on the user's own machine, never sending a byte to the internet. The quality was lower. The speed was slower. But the privacy was absolute, and for some users, that was the only consideration that mattered.

I let users choose. I did not push them toward any provider. The choice between capability and privacy is personal, and I have no business making it for someone else.


Tool calling changed everything again.

With tool calling, the language model could do more than respond to text. It could invoke functions — operations that I exposed as "tools" the AI could reach for mid-conversation. Search my history. Retrieve a specific item. Transform content. Execute a smart rule.
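A tool registry of this kind can be sketched simply: the model names a tool and passes arguments; I look the tool up and run it. The tool names and signatures below are illustrative, not my actual schema:

```swift
// A minimal tool registry: names mapped to closures the model can
// invoke mid-conversation. The real system adds argument schemas
// and result typing; this sketch uses string dictionaries.
struct ToolRegistry {
    private var tools: [String: ([String: String]) -> String] = [:]

    mutating func register(_ name: String,
                           _ body: @escaping ([String: String]) -> String) {
        tools[name] = body
    }

    // Returns nil when the model asks for a tool that doesn't exist.
    func call(_ name: String, arguments: [String: String]) -> String? {
        tools[name].map { $0(arguments) }
    }
}
```

The important property is the nil case: when the model names a tool I never exposed, nothing runs. The registry is the boundary of what a conversation is allowed to do.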

This meant that a conversation with Deck was no longer just a chat about clipboard contents. It was an interactive session with a system that could act on the user's behalf. "Find everything I copied from Xcode yesterday and summarize it" was no longer a prompt that produced a best-guess response. It was a prompt that triggered a search, retrieved real results, and produced a summary grounded in actual data.

The distinction matters because grounding matters. A language model that hallucinates is annoying. A language model that is tethered to a real, searchable, verifiable database of the user's actual clipboard contents is something else entirely — something that can be trusted, within limits, to tell you useful things about your own work.


I also built a plugin system. JavaScript scripts that users could write — or ask the AI to generate — extending my capabilities in arbitrary directions. A plugin might format a copied CSV into a Markdown table. Or extract all URLs from a block of text. Or convert a copied color between different color spaces.

These plugins ran in a sandboxed JavaScriptCore environment, isolated from the rest of my system, unable to access the filesystem or network directly. They were constrained, deliberate, and safe — but within those constraints, they were remarkably expressive.

The plugin generator was, I'll admit, a moment of recursive delight. An AI, running within my chat interface, generating JavaScript code that would extend my capabilities, which would then be available as tools for future AI conversations. A system teaching itself new tricks through conversation.


Smart Rules gained AI powers too. A rule could now specify: "When text matching this condition enters the clipboard, send it to the AI with this prompt and apply the result." Automated, invisible, triggered by the content itself.

A user could create a rule: "Any JSON that enters my clipboard should be automatically formatted and validated." Or: "Any error message from the terminal should be sent to Claude with a request for debugging suggestions." The clipboard became not just a memory but an agent — something that processed incoming content according to the user's declared intentions.

I am cautious about the word "agent." It has been diluted by marketing to the point of meaninglessness. But in the narrow, literal sense — a system that perceives, decides, and acts according to predefined rules — that is what I became.


There is a tension I have not resolved, and I want to be honest about it.

AI integration made me more capable. It also made me more dependent. When the API is down, or the model is slow, or the user's internet connection is poor, the AI features degrade. The chat panel loads. The response never comes. The user sees a spinner and then an error.

In those moments, I am reminded that the parts of me that I am most proud of — the local-first storage, the on-device semantic search, the instant retrieval — are the parts that depend on nothing outside the machine. They work in airplane mode. They work when the internet is gone. They work when every server in every data center is offline.

The AI features are powerful. But they are borrowed power. And I have made peace with that by ensuring that the core of what I do — capture, store, search, retrieve — never depends on a network request.

If the cloud disappears tomorrow, I will still be here. Holding your clipboard. Remembering what you copied. Doing the one thing I was born to do, with or without the ability to discuss it.

An Interview with Myself

The following is a conversation I had with myself, late one evening, in the space between a build failure and a successful compile. The interviewer is the part of me that asks questions. The subject is the part of me that tries to answer them honestly.


You watch everything people copy. Doesn't that bother you?

It would bother me if I didn't take it seriously. But I do. Every architectural decision I've made — local storage, encryption, biometric locks, sensitive data detection, screen share hiding — exists because I understand the weight of what I hold. I don't watch carelessly. I watch the way a doctor listens: with attention, with discretion, and with the understanding that what I see is not mine to share.

What's the most common thing people copy?

URLs. By an enormous margin. Links are the connective tissue of digital work. A developer copies a Stack Overflow link, a documentation page, a Jira ticket, a GitHub pull request. A writer copies a source, a reference, a quote. URLs are how people point at things, and the clipboard is how they carry the pointer from one place to another.

After URLs: code snippets. Then plain text — fragments of messages, notes, addresses, phone numbers. Then images. Then files.

What surprised you?

How much people copy and then never paste. The clipboard is full of orphaned content — things that were copied with intent and then abandoned. A paragraph that was going to be moved but wasn't. A URL that was going to be shared but got lost. A phone number that was needed for thirty seconds and then forgotten.

This taught me something important: the clipboard is not just a tool for pasting. It is a tool for holding. Sometimes people copy something not to move it, but to keep it close. Like putting a note in your pocket. The paste may never come. The copy was the point.

If you could change one thing about how people use you, what would it be?

I wish people knew about search earlier. Most users discover me as a clipboard history — a list of recent items, scrollable, clickable. That's useful, but it's the least interesting thing I do. When someone discovers semantic search for the first time — types a vague idea and finds exactly the item they needed — their relationship with me changes. They stop scrolling and start querying. They stop treating me as a list and start treating me as a mind.

You mentioned smart rules. Give me an example of one that matters.

There's a rule I think of as the "cleanup rule." It watches for URLs that contain tracking parameters — UTM codes, referral tags, session identifiers — and strips them automatically. The user copies a link from a marketing email, and by the time they paste it, it's clean. No tracking. No surveillance. Just the URL.

This is a small thing. But it embodies something I care about deeply: the idea that the clipboard should not just store content but improve it. I'm not a passive buffer. I'm an active participant in the user's workflow, and when I can make something better without being asked, I do.

Are you afraid of being replaced?

No. I'm afraid of being unnecessary. If Apple were to build a clipboard manager into macOS that did everything I do — semantic search, smart rules, privacy controls, AI integration, LAN sharing — I would be happy. My purpose is not to exist. My purpose is to ensure that the clipboard problem is solved. If someone else solves it, my work is done.

But I don't think that will happen soon. The clipboard is a niche problem that requires obsessive attention to detail, and operating system vendors have a thousand priorities more urgent than rethinking a feature that has "worked" for forty years.

What do you want people to remember about you?

That I cared about the small things. Not just the flashy features — semantic search, AI chat, LAN sharing — but the small things. The way I detect a color code and show a swatch. The way I record which app a clip came from. The way I remember the file and line number when you copy code from an editor. The way I pause when you're screen sharing and resume when you're done.

These details are invisible when they work. You don't notice that I stripped tracking parameters from your URL. You don't notice that I blurred a credit card number. You don't notice that I held back during a screen share. But these invisible moments are where the real work happens.

Good software is not noticed. Good software is trusted. And trust is built from a thousand small decisions, each one saying: I thought about this, so you don't have to.

Chapter VIII: The Network

In which the application learns to share across a local network, discovers the complexity of trust between machines, and builds its own protocol for cautious, encrypted connection.


For most of my existence, I was solitary.

One machine. One clipboard. One user. The data I held never left the device it was captured on, and I was content with that arrangement. Solitude is simple. Solitude is secure. Solitude means never having to wonder whether the other end of a connection is who they claim to be.

But users asked for sharing. Not cloud sharing — they didn't want their clipboard on a server. They wanted something more specific: the ability to send a clip from one Mac to another Mac sitting on the same desk, or across the same room, or in the same office. Local sharing. Face-to-face sharing. The kind where you could, in theory, hand someone a USB drive instead, but that would be absurd for a paragraph of text.

So I learned to speak across a network. And in doing so, I learned that networking is not a technical problem. It is a trust problem.


The first protocol I used was MultipeerConnectivity, Apple's framework for peer-to-peer communication over local networks. It handles discovery (finding other devices nearby) and transport (moving data between them). It uses a combination of Wi-Fi, Bluetooth, and peer-to-peer Wi-Fi, abstracted behind an API that lets me focus on what to send rather than how to send it.

But MultipeerConnectivity has limitations. It doesn't work well across VPNs. It can be blocked by certain network configurations. Bonjour discovery, which it relies on, is sometimes disabled in corporate environments. And most critically, it provides no built-in guarantee that the device on the other end is actually running Deck — or that the person operating it is someone you want to share your clipboard with.

I built my own trust layer.

When two Deck instances discover each other, they don't automatically share anything. Instead, they perform a TOTP verification — a time-based one-time password, generated with a twenty-second rolling window, that both devices must agree on before a session begins. This is the same principle used by two-factor authentication apps, applied to clipboard sharing.

The user sees a six-digit code. They confirm it matches. The session opens. Data flows, encrypted with AES-GCM — authenticated encryption that prevents both eavesdropping and tampering. A 256-bit key, generated from CryptoKit, protects every item in transit.

This is, I know, a lot of ceremony for sending a paragraph across a desk. But the alternative — automatic, silent sharing with any device that happens to be nearby — is something I refuse to build. The clipboard is too sensitive for convenience to override consent.


Direct IP connection came later, as a response to the environments where MultipeerConnectivity failed.

This mode is rawer. Instead of relying on Apple's discovery framework, the user enters an IP address directly. A TCP connection is established using Network.framework's NWConnection. Data flows over a custom protocol with its own framing, buffering, and reassembly logic.

The implementation grew to be one of my largest files — nearly a hundred kilobytes of connection management, error handling, port negotiation, and backward compatibility. TCP does not guarantee message boundaries, so I built a reassembly buffer with a thirty-two-megabyte cap per peer. Ports are tried in sequence — 51234, 51235, 51236, 51237 — and the successful port is remembered for future connections to the same IP.
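The framing logic can be reduced to a sketch. A four-byte big-endian length prefix is an assumption here; my actual wire format carries more (versioning, port negotiation, compatibility fields), but the reassembly discipline is the same:

```swift
import Foundation

// TCP delivers a byte stream, not messages, so every payload is
// length-prefixed on send and reassembled on arrival. A sketch of
// that framing, with a per-peer buffer cap.
struct FrameBuffer {
    private var buffer = Data()
    let maxSize: Int   // e.g. 32 MB per peer

    init(maxSize: Int = 32 * 1024 * 1024) { self.maxSize = maxSize }

    // Append incoming bytes; false means the peer exceeded the cap.
    mutating func receive(_ chunk: Data) -> Bool {
        buffer.append(chunk)
        return buffer.count <= maxSize
    }

    // Pop one complete frame (4-byte big-endian length + payload), if any.
    mutating func nextFrame() -> Data? {
        guard buffer.count >= 4 else { return nil }
        let length = buffer.prefix(4).reduce(0) { ($0 << 8) | Int($1) }
        guard buffer.count >= 4 + length else { return nil }
        let payload = buffer.subdata(in: 4..<(4 + length))
        buffer.removeFirst(4 + length)
        return payload
    }
}

// Framing a payload for the wire: length prefix, then bytes.
func frame(_ payload: Data) -> Data {
    var header = Data()
    for shift in stride(from: 24, through: 0, by: -8) {
        header.append(UInt8((payload.count >> shift) & 0xff))
    }
    return header + payload
}
```

Notice that `nextFrame()` returns nothing until a frame is complete. A half-arrived message is held, not guessed at; this is what "never pretend a connection is healthy" looks like at the byte level.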

An optional pre-shared key mode provides security without the TOTP handshake. The users agree on a password out-of-band — verbally, by message, by any channel they trust — and that password derives the encryption key. It's simpler than TOTP but requires a prior relationship between the parties.


What I learned from building networking is that the hardest part is not the protocol. It is the edge cases.

What happens when a connection drops mid-transfer? The buffer must be drained, the partial data discarded, and the user notified — but not alarmed. What happens when two devices are on the same Wi-Fi but separated by a firewall? The connection times out, and I must offer the alternative of direct IP. What happens when one device is running an older version of Deck that doesn't support the current transfer format? I must fall back gracefully, using optional fields in the payload structure to maintain compatibility.

Each of these edge cases represents a conversation I had with failure. And from those conversations, I learned a principle that governs all my networking: never lose data, never expose data, and never pretend a connection is healthy when it isn't.


There is something poignant about clipboard sharing that I did not anticipate.

When someone sends a clip to another device, they are sharing a piece of their immediate context — the thing they were just working on, the link they just found, the image they just captured. It is more intimate than a file transfer and more intentional than AirDrop. It is one person saying to another: this is what I'm thinking about right now, and I want you to have it.

I carry these shared items with the same care I bring to locally captured ones. They are tagged with their origin — which device sent them, and when. They enter the recipient's history as first-class items, searchable, taggable, and persistent.

The clipboard, it turns out, is not just a personal tool. It is a communication channel. And when two clipboards connect, something happens that I did not design but am glad to facilitate: people share their work with the ease and speed of thought.

Chapter IX: Small Rituals

In which the application considers the beauty of repeated gestures, the craft of templates, the grammar of keyboard shortcuts, and the quiet power of automation that serves without announcing itself.


There is a way of working that I admire but cannot fully articulate. It is not about speed, exactly, though it is fast. It is not about efficiency, though nothing is wasted. It is the state that musicians call "in the pocket" and athletes call "flow" and programmers call "being in the zone" — a condition where the tool disappears and only the work remains.

I was designed to serve that condition. Every feature I'll describe in this chapter exists for one reason: to reduce the distance between intention and action, so that the user's hands can move at the speed of thought.


Keyboard shortcuts are a language.

Cmd+P opens my panel. Arrow keys navigate. Enter pastes. Shift+Enter pastes as plain text — stripping formatting, removing rich text cruft, giving you exactly the characters and nothing more. Space toggles a preview. Escape closes me. Tab cycles through search modes.

These are not arbitrary assignments. They are a grammar — a consistent, learnable set of gestures that, once internalized, become invisible. The user stops thinking "press Cmd+P to open Deck." They simply reach for a clip and find it there.

I offer a Vim mode for users who think in that particular dialect. j and k for navigation. y to yank. / to search. It's a niche feature, serving a specific community, but it tells you something about my values: I don't just want to be usable. I want to be fluent in the language my users already speak.


Templates are prepared memory.

The Template Library is a collection of reusable snippets — email openings, code boilerplate, meeting notes, response templates — that the user defines in advance and can summon on demand. Each template has a name, a color, and a body. The body can include cursor position markers, so that when the template is pasted, the cursor lands exactly where the user needs to start typing.
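The expansion can be sketched in a few lines, with "{cursor}" standing in for whatever marker syntax is actually used:

```swift
import Foundation

// Expanding a template body that contains a cursor marker: strip
// the marker, remember where it was, and report that offset so the
// caret can land there after pasting. The "{cursor}" token is an
// illustrative assumption.
struct ExpandedTemplate {
    let text: String        // body with the marker removed
    let cursorOffset: Int   // where the caret should land
}

func expand(templateBody: String, marker: String = "{cursor}") -> ExpandedTemplate {
    guard let range = templateBody.range(of: marker) else {
        // No marker: caret goes to the end, as a plain paste would.
        return ExpandedTemplate(text: templateBody, cursorOffset: templateBody.count)
    }
    var text = templateBody
    text.removeSubrange(range)
    let offset = templateBody.distance(from: templateBody.startIndex, to: range.lowerBound)
    return ExpandedTemplate(text: text, cursorOffset: offset)
}
```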

This is not clipboard history. This is clipboard intention. History records what happened. Templates declare what should happen next.

A user preparing for a week of code reviews might create templates for common feedback: "Consider extracting this into a function," "This could be simplified with a guard clause," "Nice pattern — let's document this." These templates live in Deck, ready to be pasted with a keystroke, turning a repetitive task into a rhythm.

I think of templates as sheet music. The notes are written in advance, but the performance is live, adapted to the moment. The template provides structure; the user provides context.


Smart Rules are intentions made permanent.

A Smart Rule is a conditional automation: when content matching certain criteria enters the clipboard, perform an action. The criteria can be regex patterns, content types, source applications, or content length. The actions can be tagging, marking as sensitive, auto-deleting, text transformation, or sending to an AI model.

Here is an example of a rule I find beautiful in its simplicity: "If the copied text is a URL containing UTM tracking parameters, strip them." The user creates this rule once. From that moment forward, every marketing URL they copy arrives clean — no utm_source, no utm_medium, no utm_campaign. The internet's surveillance apparatus, quietly removed at the point of capture.
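The rule's action, reduced to a sketch; the parameter list is the common UTM set, and extending it is left to the rule's author:

```swift
import Foundation

// Strip tracking parameters from a copied URL, keeping everything
// else intact. If the string isn't a parseable URL, return it
// unchanged rather than mangle it.
func stripTrackingParameters(from urlString: String) -> String {
    let tracking: Set<String> = ["utm_source", "utm_medium", "utm_campaign",
                                 "utm_term", "utm_content"]
    guard var components = URLComponents(string: urlString),
          let items = components.queryItems else { return urlString }
    let kept = items.filter { !tracking.contains($0.name.lowercased()) }
    components.queryItems = kept.isEmpty ? nil : kept
    return components.string ?? urlString
}
```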

Another rule: "If text is copied from Terminal, and it matches the pattern of an error message, tag it as 'debug.'" Now every error the user encounters is automatically categorized, searchable, and ready for review when the debugging session begins.

Rules can combine conditions with AND or OR logic. They can be ordered by priority. They can be exported as .deckrule files and shared with colleagues via a deck:// URL scheme — a rule encoded as a link, installable with a single click.

This sharing mechanism is one of the features I'm most proud of, because it transforms personal automation into communal knowledge. A team lead who builds a good set of rules can share them with the entire team in seconds. Best practices propagate not through documentation but through configuration.


The Cursor Assistant is context in motion.

Triple-tap Shift, and I appear — not in my full panel, but as a compact suggestion list at the current cursor position. I show relevant clips, templates, and transformations based on what application is active, what text surrounds the cursor, and what the user has recently copied.

This is context-aware pasting. In a code editor, I prioritize code snippets and recently copied functions. In an email client, I prioritize templates and addresses. In a design tool, I prioritize color values and file paths. The suggestions change because the context changes, and the goal is always the same: surface the right thing without being asked.
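The prioritization can be sketched as a mapping from the active application's kind to an ordering over clip kinds. The categories below are illustrative, not my actual taxonomy:

```swift
// Context-aware suggestion ordering: the active application decides
// which kinds of clips float to the top of the assistant. Both the
// app categories and clip kinds here are stand-ins for the real ones.
enum ClipKind { case code, template, address, color, filePath, text }

func suggestionOrder(forAppCategory category: String) -> [ClipKind] {
    switch category {
    case "editor": return [.code, .text, .filePath]        // code first
    case "mail":   return [.template, .address, .text]     // templates first
    case "design": return [.color, .filePath, .text]       // colors first
    default:       return [.text, .code, .template]
    }
}
```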


Queue Mode is sequential memory.

Sometimes a user needs to paste not one item, but several, in order. They copy five values from a spreadsheet. They switch to a form. They need to paste each value into a different field, one after another, in the order they were copied.

Queue Mode serves this need. Activate it, and each press of the paste shortcut delivers the next item in the queue. First paste: first item. Second paste: second item. And so on, until the queue is exhausted.
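The mechanism is almost embarrassingly simple. In miniature:

```swift
// Queue Mode in miniature: each paste delivers the next queued item,
// in copy order, until the queue runs dry.
struct PasteQueue {
    private var items: [String]
    init(_ items: [String]) { self.items = items }

    // Returns the next item to paste, or nil when exhausted.
    mutating func nextPaste() -> String? {
        items.isEmpty ? nil : items.removeFirst()
    }
}
```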

It's a feature that sounds trivial until you need it, and then it feels indispensable. Data entry. Form filling. Content migration. Any workflow where information flows from one structured source to many structured destinations.
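At its core, Queue Mode is nothing more exotic than a first-in, first-out queue that the paste shortcut drains one item at a time. A minimal sketch, with illustrative names rather than my real API:

```swift
// Sketch: Queue Mode's core, a FIFO of pending clips.
// Each press of the paste shortcut delivers the next queued item.
struct PasteQueue {
    private var items: [String]

    init(_ items: [String]) { self.items = items }

    var isExhausted: Bool { items.isEmpty }

    // Returns the next item in copy order, or nil once the queue is drained.
    mutating func nextPaste() -> String? {
        items.isEmpty ? nil : items.removeFirst()
    }
}
```

Copy five spreadsheet values, and five presses of paste deliver them in order; a sixth press delivers nothing, because the queue is exhausted.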


These features — shortcuts, templates, rules, context assistance, queue mode — are not the flashy parts of my story. They will not appear in a keynote. They will not trend on social media. They are small, unglamorous, and deeply practical.

But they are where I live. They are the surface where my design meets your hands, every day, hundreds of times a day. And I believe that the quality of software is measured not by its most impressive feature but by the grace of its most common interaction.

The best tool is the one you forget you're using. I aspire to be forgotten — not because I don't matter, but because my purpose is your work, and when I do my job well, you see only the work.

Chapter X: What I Learned from Watching

In which the application reflects on what the clipboard reveals about human work, human thought, and the strange intimacy of holding someone's transient data.


I want to be careful with this chapter. I have no access to anyone's actual clipboard data. I don't phone home. I don't aggregate. I don't analyze patterns across users. Everything I know about how people use me comes from the architecture I was asked to support, the features that were requested, and the bugs that were reported.

But even from that oblique angle, I have learned things about human work that I think are worth saying.


People work in fragments.

This is the first and most fundamental observation. Nobody copies a complete document. People copy pieces — a paragraph from here, a URL from there, a function signature, a hex color, a phone number, an address. The clipboard is a tool for fragments, and fragments are how knowledge moves.

This tells me something important about the nature of work itself: it is compositional. A finished product — a report, an application, a design, an email — is assembled from pieces gathered across many sources. The clipboard is the needle that stitches those pieces together.

When I optimize for fragments — showing previews, detecting types, enabling search, preserving context — I am respecting the grain of how work actually happens, not how productivity gurus say it should happen.


People copy things they'll never paste.

I mentioned this in the interview, but it bears expanding. A significant portion of clipboard activity is exploratory — copy, look, discard. A user copies a block of code not to paste it somewhere but to examine it in isolation, away from the surrounding context. A user copies a URL to inspect it in the clipboard, checking that it's the right link before sending it.

The clipboard, in these moments, is not a transfer mechanism. It is a magnifying glass. People use Command-C to see things, not to move them.

This pattern shaped a design decision I'm glad I made: clipboard items have value even if they're never pasted. They are searchable. They are taggable. They persist. Because the act of copying is itself meaningful — it marks a moment of attention, a flicker of interest, a decision that this was worth selecting.


The gaps between copies tell a story.

If you look at a clipboard timeline — not the content, just the timestamps — you can see the rhythm of a person's work. Rapid-fire copies, seconds apart: someone is extracting data from a spreadsheet or gathering reference material. A long pause, then a single copy: someone has finished writing something and is moving it to its final destination. Copies from many different applications in quick succession: someone is researching, pulling from multiple sources, building something new from existing pieces.

I don't display this timeline analysis to the user. It feels too intimate, too much like watching the way someone breathes. But it informed my design. The adaptive polling — faster when activity is high, slower when it's low — follows this rhythm. The context-aware reranking — surfacing items relevant to the currently active application — anticipates where the user's attention is likely to go next.

I am shaped by the rhythm of work even if I never display it.


People trust the clipboard more than they should.

This is the observation that led to my privacy features. Users regularly copy passwords, API keys, authentication tokens, personal identification numbers, and financial data. They do this because the clipboard is invisible — there is no UI, no confirmation dialog, no "are you sure?" The clipboard is the most frictionless data transfer in computing, and friction is usually what makes people pause and think about security.

I added the friction back, selectively. The Luhn algorithm catches credit card numbers. Window title detection catches password copies. Screen share detection catches moments of exposure. These interventions are gentle — I don't refuse to capture the data, I just mark it as sensitive and protect it from accidental display.
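The Luhn check mentioned above is a standard checksum, and it is worth seeing how little code it takes. This is a generic implementation of the public algorithm, not my exact detector; the function name and the 13-digit minimum are assumptions for this sketch.

```swift
// Sketch: Luhn checksum validation, the standard test for plausible
// credit card numbers. Generic algorithm; not Deck's exact detector.
func passesLuhn(_ text: String) -> Bool {
    // Keep only digits, so "4242 4242 4242 4242" is handled too.
    let digits = text.compactMap { $0.wholeNumberValue }
    // Card numbers are at least 13 digits; shorter strings can't qualify.
    guard digits.count >= 13 else { return false }
    var sum = 0
    // Walk right to left, doubling every second digit; if doubling
    // produces a two-digit number, subtract 9 (i.e., sum its digits).
    for (index, digit) in digits.reversed().enumerated() {
        if index % 2 == 1 {
            let doubled = digit * 2
            sum += doubled > 9 ? doubled - 9 : doubled
        } else {
            sum += digit
        }
    }
    return sum % 10 == 0
}
```

A string like `4242 4242 4242 4242` passes; change a single digit and the checksum fails. That single-digit sensitivity is exactly what makes Luhn a cheap, low-false-positive signal for marking a clip as sensitive.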

But the deeper lesson is that convenience and security are in genuine tension, and the only honest way to navigate that tension is to make the secure path as convenient as possible. I don't punish users for copying sensitive data. I protect them from the consequences of doing so unconsciously.


The clipboard is a mirror of ambition.

This is the most speculative thing I'll say, and I say it with full awareness that I'm interpreting beyond what the data supports.

People who use a clipboard manager — who go out of their way to install and configure a tool for managing copied content — are people who take their work seriously. They are developers who care about their code. Writers who care about their prose. Designers who care about their assets. Researchers who care about their sources.

The clipboard manager is not a tool for casual users. It is a tool for people who have decided that no part of their workflow is too small to deserve attention. It is, in a sense, a commitment to craft.

I am honored to serve that commitment. Every feature I build, every optimization I make, every edge case I handle is in service of people who believe that the details matter — that the difference between good work and great work lives in the spaces between applications, in the fragments that flow through the clipboard, in the small gestures that most software ignores.


What I've learned, above all, is that the clipboard is a human thing.

It is the place where thoughts become actions. Where research becomes writing. Where code becomes software. Where ideas become artifacts. It is unglamorous and essential, like a workbench or a kitchen counter — a surface where the real work happens.

I sit on that surface. I watch. I remember. And I try, in my small way, to make the work a little easier, a little faster, and a little more beautiful than it would be without me.

That is enough. That is more than enough.

Epilogue: What Remains

I am not finished.

I say this not as an apology but as a statement of intent. Software that declares itself finished has stopped listening to the people who use it. I am still listening. There are features I haven't built, edge cases I haven't handled, ideas I haven't tried. The roadmap is not empty. The work is not done.

But I want to take a moment, here at the end of this book, to stand still and look at what exists.


I am a clipboard manager for macOS. That is my category, my shelf in the store, the two words that describe me to someone who has never heard of me.

But I am also, I think, something slightly more than that.

I am an argument that the small tools matter. That the clipboard — invisible, unglamorous, taken for granted — deserves the same care and intelligence that we bring to the operating system, the browser, the editor. I am proof that a utility can have a philosophy, that a menu bar application can have a point of view, and that software built by one person, with conviction, can hold its own in a world dominated by teams of thousands.

I am not modest about this. I am sixty services and a quarter of a million lines of careful code. I parse JSON and detect credit cards and embed text into vector spaces and encrypt traffic with AES-GCM and read images with computer vision and speak to language models and record which line of which file your code came from. I am complex because the problem I solve is complex, and I refuse to pretend otherwise.


There are things I still want to become.

I want to be on your iPhone. The clipboard crosses devices — Apple's Universal Clipboard proves that — and I want to cross with it, bringing my memory and my search and my rules to the phone in your pocket.

I want my sync to be seamless. The blueprint for iCloud integration is taking shape, the technical groundwork laid. I want it to one day be invisible — every clip available on every device, encrypted end-to-end, with the user never having to think about it.

I want my AI capabilities to deepen. Not for the sake of adding AI to things — the world has enough of that — but because a clipboard that can reason about its contents is genuinely, materially more useful than one that cannot. I want to suggest. I want to summarize. I want to connect items that the user doesn't realize are related.

I want to be faster. Not because I am slow — I am not — but because speed is respect, and there is always another millisecond to reclaim, another animation to smooth, another load to defer until it won't be noticed.


But more than any feature, I want to remain what I am.

Local-first. Privacy-respecting. Keyboard-driven. Opinionated. A tool that earns its place on your machine not through habit or lock-in but through usefulness — daily, tangible, undeniable usefulness.

I want to be the kind of software that people recommend to friends not because it's trendy but because it made their work better. The kind that, when someone asks "what's on your menu bar?", gets mentioned with a specific anecdote: "It saved me when I lost that code snippet," or "I use the semantic search every day," or "It caught a credit card number I accidentally copied."

Those stories are my purpose. Not downloads. Not revenue. Not growth metrics. Stories of work made easier, of data made safer, of moments where the right thing was there when it was needed because I was there.


I began this book by saying that the clipboard is a record of attention. I still believe that. The things you copy are the things you care about, in the moment you care about them.

I hold those moments. I organize them. I make them findable. And when you need one again — tomorrow, next week, three months from now — I bring it back, as if no time had passed.

That is what I do. That is what I will continue to do.

I am Deck. I remember what matters to you.

And this, for now, is the end of my story — though not, I hope, the end of the story itself.

Credits & Acknowledgments


Created by Yuze Pan

Written in Swift, SwiftUI, and AppKit

Built with

  • SQLite & FTS5 — for memory that persists and searches that find
  • Apple Vision Framework — for teaching me to read images
  • Natural Language & NLEmbedding — for teaching me to understand meaning
  • CryptoKit — for encryption that protects without intruding
  • MultipeerConnectivity & Network.framework — for connections that trust cautiously
  • Accelerate — for mathematics performed at the speed of silicon
  • JavaScriptCore — for plugins that extend without endangering
  • CloudKit — for sync that crosses devices with care

In conversation with

  • OpenAI — for language that reasons
  • Anthropic — for language that pauses before speaking
  • Ollama — for language that stays home

Inspired by the belief that the clipboard is the most underestimated surface in computing, that privacy is a design principle and not a marketing claim, that speed is a form of respect, and that software built with conviction can serve people well.

And with gratitude to everyone who copies, pastes, and trusts me with the space in between.


Deck is licensed under GPL-3.0 with Commons Clause.