Python has a rich ecosystem of quality, mature tooling: linting, formatting, type checking, and the like. Each has decent performance, but what if the tooling were fast? Really fast – as in, instantaneous?

This is the argument posed by Charlie Marsh when he introduced Ruff: a linter with its engine written in Rust. The performance numbers were incredible from the start, as was the reception in the Python community. Ruff is developing quickly – not just filling in the details, but expanding beyond linting.

PyCharm is hosting Charlie for a February 14th webinar. We caught up with Charlie to collect some background on him and the project.

Register for the webinar

Quick hit: Why should people care about Ruff?

Ruff’s “flagship feature” is performance – it aims to be orders of magnitude faster than existing tools. Even if you don’t think of your existing linter as slow, you might be surprised by how different it feels to use Ruff. Even on very large projects, Ruff can give you what is effectively an instant feedback loop.

But even beyond performance, I think Ruff offers something pretty different: it’s a single tool that can replace dozens of existing tools, plugins, and packages – it’s one tool to learn and configure that gives you access to hundreds of rules and automated code transformations, with new capabilities shipping every day.

To get a detailed overview of Ruff, check out this recent Talk Python podcast episode.

Now, some introductions: Tell us about Charlie Marsh.

I’ve kind of inadvertently spent my career jumping between programming ecosystems. At Khan Academy, I worked professionally on web, Android, iOS, and with Python on the back-end. At Spring Discovery, I joined as the second engineer and led the development of our software, data, and machine learning platforms, which meant mostly Python with frequent detours into web (and, later on, Rust).

Moving between these ecosystems has really influenced how I think about tooling – I see something that the web does well, and I want to bring it to Python, or vice versa. Ruff is based on a lot of those observations, and motivated by a lot of my own experiences at Spring – it’s the tooling I wish I’d had.

Outside of work: I live in Brooklyn, NY with my wife and four-month-old son.

The JavaScript world has embraced fast tooling. Can you describe the what/why/how?

From my perspective, the first project that comes to mind is esbuild, a fast JavaScript “bundler” written in Go. I always loved this line from their homepage (“The main goal of the esbuild bundler project is to bring about a new era of build tool performance”), because along with being fast, esbuild was able to change expectations around how fast tooling could be.

Later on came swc, a TypeScript and JavaScript compiler written in Rust. And since then, we’ve seen Turbopack, Parcel, Bun, and more, all of which contain native implementations of functionality that was once implemented in pure JavaScript.

Ignoring the details of what these tools actually do, the higher-level thesis is that web tooling doesn’t have to be written in JavaScript. Instead, it can be written in lower-level, more performant languages, with users reaping the benefits of greatly increased performance.

(We’ve seen this trend outside of the JavaScript world too. For example, the rubyfmt autoformatter was recently rewritten in Rust.)

How does this translate to the what/why/how for Python?

Most Python tooling is written in Python. There are of course exceptions: Mypy is compiled to a C extension via Mypyc; Pyright is written in TypeScript and runs on Node.js; the scientific Python stack, like NumPy, is written in C and other languages; much of CPython itself is written in C and highly optimized; and so on. But if you look at, for example, the existing linters, or the typical popular Python developer tool, it’s probably written in Python.

That’s not meant as a criticism – I’m not a Rust maximalist. That is: I don’t believe that every piece of software ever should be rewritten in Rust. (If I did, it’d be a strange choice to work on Python tooling!) But if you buy into the lessons learned from the web ecosystem, it suggests that there’s room to innovate on Python tooling by, in some cases, exploring implementations in more performant languages, and exposing those implementations via Python interfaces.

And if you accept that premise, then Rust is a natural fit, since Python has a really good story when it comes to integrating and interoperating with Rust: you can ship pure Rust and mixed Rust-Python projects to PyPI using Maturin, and your users can install them with pip just like any other Python package; you can implement your “Python” library in Rust, and expose it on the Python side with PyO3. It feels magical, and my experience with those tools at Spring Discovery was a big part of why I considered building a Rust-based Python linter in the first place.
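To make that concrete, here is a minimal sketch of the pattern he describes: a Rust function exposed to Python as an importable module via PyO3, built and installed with Maturin. The module and function names here (fastcheck, count_code_lines) are purely hypothetical, and the exact module-initialization signature varies between PyO3 releases.

```rust
// Hypothetical "Python" function implemented in Rust with PyO3.
// Build it into the active virtual environment with `maturin develop`.
use pyo3::prelude::*;

/// Count the non-empty lines in a chunk of Python source.
#[pyfunction]
fn count_code_lines(source: &str) -> usize {
    source.lines().filter(|line| !line.trim().is_empty()).count()
}

/// Module initializer: this is what `import fastcheck` resolves to on the
/// Python side. (The signature shown is the pre-0.21 PyO3 style; newer
/// releases take a `Bound<'_, PyModule>` argument instead.)
#[pymodule]
fn fastcheck(_py: Python<'_>, m: &PyModule) -> PyResult<()> {
    m.add_function(wrap_pyfunction!(count_code_lines, m)?)?;
    Ok(())
}
```

Once built, the Python side is indistinguishable from any other installed package: `import fastcheck`, then call `fastcheck.count_code_lines(source)` like a normal function.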

While the Rust-Python community still feels nascent in some ways, I think Ruff is part of a bigger trend here. Polars is another good example of this kind of thinking, where they’ve built an extremely performant DataFrame library in Rust, and exposed it with Python bindings.

You’ve been on a performance adventure. What has surprised you?

Ideas are great, but benchmarks are where they meet reality and are either proven or disproven. Seemingly small optimizations can have a significant impact, but not every apparent optimization improves performance in practice.

When I have an idea for an optimization, my goal is always to benchmark it as quickly as possible, even if it means cutting corners, skipping cases, writing messy code, etc. Sometimes, the creative and exciting ideas make no measurable difference. Other times, a rote change can move the needle quite a bit. You have to benchmark your changes, on “real” code, to be certain.
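As a rough illustration of that quick-and-dirty loop (the file name and the candidate function below are placeholders, not anything from Ruff itself), even a throwaway harness built on std::time::Instant, pointed at a real source file, is often enough to accept or reject an idea before writing a proper benchmark:

```rust
// Throwaway micro-benchmark: time a candidate routine against real input.
use std::time::Instant;

// Placeholder for whatever change is being evaluated.
fn count_long_lines(source: &str) -> usize {
    source.lines().filter(|line| line.len() > 88).count()
}

fn main() -> std::io::Result<()> {
    // Measure against a real file from a real project, not a toy fixture.
    let source = std::fs::read_to_string("big_module.py")?;

    let start = Instant::now();
    let mut total = 0;
    for _ in 0..100 {
        // black_box keeps the optimizer from hoisting the pure call out of the loop.
        total += std::hint::black_box(count_long_lines(&source));
    }
    println!("{} long lines, ~{:?} per run", total / 100, start.elapsed() / 100);
    Ok(())
}
```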

Another project-related tension that I hadn’t anticipated is that, if you really care about performance, you’re constantly faced with decisions around how to prioritize. Almost every new feature will reduce performance in some way, since you’re typically doing more work than you were before. So what’s the acceptable limit? What’s the budget? If you make something slower, can you speed up something else to balance the scales?

Is it true you might be thinking beyond linting?

It’s absolutely true! I try to balance being open about the scope of my own interests against a fear of overcommitting and overpromising.

But with that caveat… My dream is for Ruff to evolve into a unified linter, autoformatter, and type checker – in short, a complete static analysis toolchain. There are significant benefits to bundling all of that functionality: you can do less repeated work, and each tool can do a better job than if any of them were implemented independently.

I think we’re doing a good job with the linting piece, and I’ve been starting to work on the autoformatter. I’ll admit that I don’t know anything about building a type checker, except that it’s complicated and hard, so I consider that to be much further out. But it’s definitely on my mind.

You’re a PyCharm user. We also think a lot about tooling. What’s your take on the Python world’s feelings about tooling?

I talk to a lot of people about Python tooling, and hear a lot of complaints – but those complaints aren’t always the same.

Even still, I look back just a few years and see a lot of progress, both technologically and culturally – better tools, better practices (e.g., autoformatters, lockfiles), PEPs that’ve pushed standards forward. So I try to remain optimistic, and view every complaint as an opportunity.

On a more specific note: there’s been a lot of discussion around packaging lately, motivated by the Python Packaging Survey that the PSF facilitated. (Pradyun Gedam wrote a nice blog post in response.) One of the main critiques was around the amount of fragmentation in the ecosystem: you need to use a bunch of different tools, and there are multiple tools to do any one job. The suggestion of consolidating a lot of that functionality into a single, blessed tool (like Rust’s cargo) came up a few times.

I tend to like bundling functionality, but I also believe that competition can push tooling forward. You see this a lot in the web ecosystem, where npm, yarn, pnpm, and bun are all capable of installing packages. But they all come with different tradeoffs.

I’d like to see tools in the Python ecosystem do a better job of articulating those tradeoffs. Python is used by so many different audiences and userbases, for so many different things. Who’s your target user? Who’s not? What tradeoffs are they making by choosing your tool over another?

Register for the webinar

To help fill the seats: give us a teaser for what folks will see in the webinar.

I’d like to give viewers a sense of what it feels like to use Ruff, and the kinds of superpowers it can give you as a developer – not just in terms of performance, but also code transformation and simplicity of configuration.
