Professional systems enforce state invariants — rules that must always hold. Python uses naming conventions
(_single, __double) and @property to create validated, read‑only, or computed attributes without breaking backward compatibility.
We’re looking for articles that go beyond basics: how to design classes that are impossible to misuse.
Instead of exposing raw attributes, wrap them in properties. This allows validation, logging, or lazy loading later.
Example: a BankAccount that never allows negative balance, implemented via a setter.
class BankAccount:
    def __init__(self, balance):
        self.balance = balance  # route through the setter so the invariant holds from construction

    @property
    def balance(self):
        return self._balance

    @balance.setter
    def balance(self, value):
        if value < 0:
            raise ValueError("invariant violated: balance cannot be negative")
        self._balance = value
What we'd like to see: real‑world invariants (e.g., business rules, state machines) and how to enforce them without performance penalty. Consider discussing descriptor protocol for reusable validation logic.
Another advanced technique is using __setattr__ to enforce invariants across multiple attributes, but care must be taken to avoid infinite recursion. A well‑crafted property setter is usually sufficient and more explicit.
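A minimal sketch of that __setattr__ technique, guarding a cross-attribute invariant (the Interval class and its start/end rule are illustrative, not from any library):

```python
class Interval:
    """Sketch: __setattr__ enforces start <= end across two attributes."""

    def __init__(self, start, end):
        # Plain assignments route through __setattr__, so the
        # invariant is enforced from construction onward.
        self.start = start
        self.end = end

    def __setattr__(self, name, value):
        # Validate a candidate state *before* mutating, so a rejected
        # write leaves the object unchanged and still valid.
        candidate = dict(self.__dict__)
        candidate[name] = value
        if "start" in candidate and "end" in candidate and candidate["start"] > candidate["end"]:
            raise ValueError("invariant violated: start > end")
        # Delegating to object.__setattr__ avoids infinite recursion.
        object.__setattr__(self, name, value)
```

Checking before assigning is the key design choice: a failed write never leaves the object in a half-updated state.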
Certified codebases rely on SOLID to survive decade‑long maintenance. We’re interested in Python‑specific interpretations:
how to apply SRP without exploding files, LSP with Protocol, and OCP via strategy objects.
SRP: keep User and UserRepository separate; don't mix database and domain logic. Example: a class that both validates and persists violates SRP.
OCP: use @abstractmethod to define extension points. Python's first-class functions often allow OCP without subclass explosion.
ISP: use typing.Protocol (PEP 544). Instead of one fat interface, define several small protocols.
Call for content: we'd love a tutorial showing a messy legacy class and a step-by-step refactoring to SOLID, with performance/maintainability metrics. Include before/after code and explain the tradeoffs.
For instance, consider a class that sends notifications, logs, and accesses the database. Splitting these into separate classes with clear responsibilities (SRP) and injecting dependencies (DIP) makes the system testable and maintainable.
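A compact sketch of that split, with constructor injection (all class and protocol names here are illustrative):

```python
from typing import Protocol


class Logger(Protocol):
    def log(self, message: str) -> None: ...


class UserStore(Protocol):
    def save(self, user: dict) -> None: ...


class ConsoleLogger:
    def log(self, message: str) -> None:
        print(message)


class InMemoryStore:
    """List-backed store; a real one would talk to a database."""
    def __init__(self) -> None:
        self.saved: list[dict] = []

    def save(self, user: dict) -> None:
        self.saved.append(user)


class RegistrationService:
    """Orchestrates only (SRP); depends on abstractions, not concretions (DIP)."""
    def __init__(self, store: UserStore, logger: Logger) -> None:
        self._store = store
        self._logger = logger

    def register(self, user: dict) -> None:
        self._store.save(user)
        self._logger.log(f"registered {user['name']}")
```

Because the service only knows the protocols, tests can pass an in-memory store without touching a database.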
High‑level modules should not depend on low‑level details. In Python, both abc.ABC and typing.Protocol
(structural subtyping) let you define clean boundaries. We seek articles on real‑world dependency injection without frameworks,
and also with tools like dependency-injector.
from typing import Protocol

class Notifier(Protocol):
    def send(self, message: str) -> None: ...

class EmailNotifier:
    def send(self, message: str) -> None:
        print(f"sending email: {message}")

class AlertService:
    def __init__(self, notifier: Notifier):
        self._notifier = notifier

    def alert(self, message: str) -> None:
        self._notifier.send(message)
We’d also welcome pieces on testing with mock protocols, or using inject decorators. Protocols allow static checking without inheritance, making them ideal for adapting external libraries.
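A minimal sketch of that testing approach: a hand-rolled spy satisfies the protocol structurally, so no inheritance or unittest.mock is required (SpyNotifier and the alert method are illustrative names):

```python
from typing import Protocol


class Notifier(Protocol):
    def send(self, message: str) -> None: ...


class AlertService:
    def __init__(self, notifier: Notifier) -> None:
        self._notifier = notifier

    def alert(self, message: str) -> None:
        self._notifier.send(message)


class SpyNotifier:
    """Test double: records messages instead of sending them.
    It matches Notifier structurally; no subclassing needed."""
    def __init__(self) -> None:
        self.sent: list[str] = []

    def send(self, message: str) -> None:
        self.sent.append(message)
```

The test then asserts on spy.sent rather than patching anything.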
Advanced topics: using functools.partial for partial application of dependencies, or building a simple dependency injection container with dictionaries and factories.
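A sketch of such a container, combining a dict of factories with functools.partial for pre-bound arguments (Container and its method names are invented for illustration):

```python
from functools import partial


class Container:
    """Minimal DI container: a dict of zero-argument factories,
    with lazily built, cached singletons."""

    def __init__(self) -> None:
        self._factories = {}
        self._singletons = {}

    def register(self, key, factory):
        # factory: any zero-argument callable producing the dependency
        self._factories[key] = factory

    def resolve(self, key):
        # Build each dependency once, then reuse it
        if key not in self._singletons:
            self._singletons[key] = self._factories[key]()
        return self._singletons[key]


def make_notifier(prefix: str):
    """Illustrative factory with a configurable argument."""
    return lambda msg: f"{prefix}: {msg}"
```

partial lets you turn a parameterized factory into the zero-argument callable the container expects: container.register("notifier", partial(make_notifier, "ALERT")).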
Patterns capture proven solutions. Python’s first‑class functions often simplify these patterns. We're after practical, non‑contrived examples —
e.g., using functools.singledispatch for Visitor, or context managers for RAII.
A factory method or class decouples client from concrete classes. Useful when object setup is complex or varies with context.
class SerializerFactory:
    def get_serializer(self, format):
        if format == 'json':
            return JsonSerializer()
        if format == 'xml':
            return XmlSerializer()
        raise ValueError('unknown format')
Consider using dict of formats to classes for a more extensible factory.
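A sketch of that registry variant (the serializer classes are stand-ins; in real code they would implement actual serialization):

```python
class JsonSerializer: ...
class XmlSerializer: ...


# Registry maps format names to classes; supporting a new format is
# one new entry, with no change to the factory function (OCP-friendly).
SERIALIZERS: dict[str, type] = {
    "json": JsonSerializer,
    "xml": XmlSerializer,
}


def get_serializer(fmt: str):
    try:
        return SERIALIZERS[fmt]()
    except KeyError:
        raise ValueError(f"unknown format: {fmt}") from None
```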
Define a family of algorithms, make them interchangeable. In Python you can pass a callable directly.
class Discounter:
    def __init__(self, strategy):
        self._strategy = strategy

    def apply(self, amount):
        return self._strategy(amount)

# usage: Discounter(lambda a: a * 0.9)  # 10% off
Ideal article: compare classic GoF implementation vs Pythonic functional approach, with tradeoffs. Also discuss when to use Enum of strategies.
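One possible shape for the Enum-of-strategies variant (Discount and its members are hypothetical; note the tuple wrapping, since a bare function in an Enum body becomes a method rather than a member):

```python
from enum import Enum


class Discount(Enum):
    """Each member carries its strategy as a callable wrapped in a tuple."""
    NONE = (lambda amount: amount,)
    SEASONAL = (lambda amount: amount * 0.9,)  # illustrative 10% discount

    def apply(self, amount: float) -> float:
        return self.value[0](amount)
```

This keeps the closed set of strategies discoverable and serializable by name, at the cost of the open extensibility a plain callable gives you.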
The Visitor pattern can be elegantly implemented with functools.singledispatch to handle different types without modifying the classes themselves.
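A minimal sketch of that singledispatch approach (the shape classes are invented for illustration):

```python
import math
from functools import singledispatch


class Circle:
    def __init__(self, radius: float):
        self.radius = radius


class Square:
    def __init__(self, side: float):
        self.side = side


@singledispatch
def area(shape) -> float:
    # Fallback for unregistered types
    raise NotImplementedError(f"no area() for {type(shape).__name__}")


@area.register
def _(shape: Circle) -> float:
    return math.pi * shape.radius ** 2


@area.register
def _(shape: Square) -> float:
    return shape.side ** 2
```

New "visits" are added by registering new functions; Circle and Square themselves never change.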
Python classes normally store attributes in a per‑instance __dict__, which carries significant memory overhead beyond the attribute values themselves (often tens of bytes per instance).
Declaring __slots__ tells Python to allocate fixed storage for exactly the named attributes, which can reduce memory usage by 40–60% across large numbers of instances.
# slotted class
class Point3D:
    __slots__ = ('x', 'y', 'z')

    def __init__(self, x, y, z):
        self.x, self.y, self.z = x, y, z

# rough memory comparison for 10M points: with slots ~320MB, without slots ~720MB
Use pympler or tracemalloc to measure the real footprint. Show __slots__ with inheritance, __weakref__ support, or the cases where slots should be avoided.
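A stdlib-only sketch of such a measurement with tracemalloc (the helper name and point classes are illustrative; figures vary by Python version):

```python
import tracemalloc


class PlainPoint:
    def __init__(self, x, y):
        self.x, self.y = x, y


class SlotPoint:
    __slots__ = ("x", "y")

    def __init__(self, x, y):
        self.x, self.y = x, y


def footprint(cls, n: int = 50_000) -> int:
    """Bytes traced as allocated while holding n live instances (rough)."""
    tracemalloc.start()
    instances = [cls(i, i) for i in range(n)]
    current, _peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    del instances
    return current
```

Comparing footprint(SlotPoint) against footprint(PlainPoint) shows the per-instance savings directly.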
Slots also provide faster attribute access. However, they prevent dynamic attribute creation and can complicate multiple inheritance. A deep dive could explore __weakref__ and __dict__ in slots.
In domain‑driven design, the Repository pattern mediates between the domain and data mapping. It provides a collection‑like interface for aggregates. Unit of Work tracks changes and commits them atomically. We're seeking articles with SQLAlchemy 2.0 examples or async repositories.
from abc import ABC, abstractmethod

class Repository(ABC):
    @abstractmethod
    def add(self, entity): ...

    @abstractmethod
    def get(self, id): ...

class SqlAlchemyRepository(Repository):
    def __init__(self, session, entity_class):
        self.session = session
        self.entity_class = entity_class

    def add(self, entity):
        self.session.add(entity)

    def get(self, id):
        return self.session.get(self.entity_class, id)
We especially appreciate articles showing integration with FastAPI or background task workers. The Unit of Work pattern can be combined with SQLAlchemy's session to provide atomic operations.
Advanced example: using generic repositories with TypeVar to avoid boilerplate, and implementing specification pattern for queries.
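A sketch of the TypeVar approach: one generic base instead of a hand-written ABC per entity type (the in-memory implementation is an illustrative stand-in for a database-backed one):

```python
from abc import ABC, abstractmethod
from typing import Generic, Optional, TypeVar

T = TypeVar("T")


class Repository(ABC, Generic[T]):
    """Generic collection-like interface over entities of type T."""

    @abstractmethod
    def add(self, entity_id: int, entity: T) -> None: ...

    @abstractmethod
    def get(self, entity_id: int) -> Optional[T]: ...


class InMemoryRepository(Repository[T]):
    """Dict-backed implementation, handy as a test double."""

    def __init__(self) -> None:
        self._items: dict[int, T] = {}

    def add(self, entity_id: int, entity: T) -> None:
        self._items[entity_id] = entity

    def get(self, entity_id: int) -> Optional[T]:
        return self._items.get(entity_id)
```

A type checker can then verify that Repository[User] and Repository[Order] are used consistently, with no per-entity boilerplate.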
Modern Python codebases enforce typing with mypy or pyright; these checkers catch interface violations and missing attributes before runtime.
A typical pyproject.toml configuration disallows untyped function definitions and checks implementations against their protocols.
# mypy --strict requires every function to be fully annotated
def process(data: list[int]) -> dict[str, float]:
    return {"mean": sum(data) / len(data)}
We'd like to see articles on gradual typing in legacy codebases, custom mypy plugins, or integrating pyright in CI for open source. Discussing the tradeoffs of strict mode vs productivity is valuable.
Another angle: using typing.overload to express complex function signatures, or creating type stubs for untyped libraries.
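A small sketch of typing.overload (the first() function is an invented example): the stubs tell the checker that the return type tracks the element type, while a single runtime implementation does the work.

```python
from typing import overload


@overload
def first(items: list[int]) -> int: ...
@overload
def first(items: list[str]) -> str: ...


def first(items):
    # Only this implementation runs; the @overload stubs above
    # exist purely for the type checker and are never executed.
    return items[0]
```

With these overloads, mypy infers first([1, 2]) as int and first(["a"]) as str, instead of a vague union.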
We publish 2‑3 articles per month. If you’d like to write for us, please submit a proposal.
Accepted articles will be featured on our homepage and promoted via social media. We offer light editing and feedback to ensure technical accuracy. Join us to share your knowledge with the Python community!