Fluent Python: Clear, Concise, and Effective Programming
If an infix operator method raises an exception, it aborts the operator dispatch algorithm. In the particular case of TypeError, it is often better to catch it and return NotImplemented. This allows the interpreter to try calling the reversed operator method, which may correctly handle the computation with the swapped operands, if they are of different types.
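A minimal sketch of this pattern, using a toy 2-D Vector (not the book's full class): `__mul__` converts its operand with `float()`, catches the resulting TypeError for non-numeric operands, and returns NotImplemented so the interpreter can try the reflected method on the other operand.

```python
# Sketch (illustrative Vector, not the book's full implementation) of
# catching TypeError inside an operator method and returning
# NotImplemented so Python can fall back to the reflected method.
class Vector:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __mul__(self, scalar):
        try:
            factor = float(scalar)   # raises TypeError for e.g. None
        except TypeError:
            return NotImplemented    # let Python try scalar.__rmul__(self)
        return Vector(self.x * factor, self.y * factor)

    def __rmul__(self, scalar):
        return self * scalar         # scalar multiplication commutes

    def __eq__(self, other):
        return (self.x, self.y) == (other.x, other.y)

v = Vector(1.0, 2.0)
print(v * 3 == Vector(3.0, 6.0))  # True

try:
    v * None   # both __mul__ and the reflected attempt fail
except TypeError as exc:
    print(exc)  # interpreter reports an unsupported operand type
```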
The @ sign is well-known as the prefix of function decorators, but since 2015, it can also be used as an infix operator.
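As a quick illustration (with a hypothetical MiniVec class, not from the book): since Python 3.5, `a @ b` dispatches to `__matmul__` (with `__rmatmul__` and `__imatmul__` as the reflected and in-place variants).

```python
# Illustrative use of @ as an infix operator via __matmul__,
# computing a dot product between two small vectors.
class MiniVec:
    def __init__(self, *components):
        self.components = list(components)

    def __matmul__(self, other):
        try:
            pairs = zip(self.components, other.components)
        except AttributeError:
            return NotImplemented    # operand without .components
        return sum(a * b for a, b in pairs)  # dot product

print(MiniVec(1, 2, 3) @ MiniVec(4, 5, 6))  # 32
```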
The zip built-in accepts a strict keyword-only optional argument since Python 3.10. When strict=True, the function raises ValueError when the iterables have different lengths.
In the face of ambiguity, refuse the temptation to guess.
The in-place special methods should never be implemented for immutable types like our Vector class.
The limitations Python imposes on operator overloading: no redefining of operators in the built-in types themselves, overloading limited to existing operators, and a few operators left out entirely (is, and, or, not).
Unary and infix operators are supposed to produce results by creating new objects, and should never change their operands. To support operations with other types, we return the NotImplemented special value—not an exception—allowing the interpreter to try again by swapping the operands and calling the reverse special method for that operator (e.g., __radd__).
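The swap-and-retry mechanism can be sketched with a toy Scalar wrapper (an illustrative class, not from the book): `3 + Scalar(4)` first tries `int.__add__`, which returns NotImplemented, so Python calls `Scalar.__radd__` with the operands swapped.

```python
# Sketch of __add__/__radd__ dispatch with a toy Scalar wrapper.
class Scalar:
    def __init__(self, value):
        self.value = value

    def __add__(self, other):
        if isinstance(other, Scalar):
            return Scalar(self.value + other.value)
        if isinstance(other, (int, float)):
            return Scalar(self.value + other)
        return NotImplemented  # a sentinel value, not an exception

    def __radd__(self, other):
        return self + other    # 3 + Scalar(4) ends up here

print((3 + Scalar(4)).value)  # 7
print((Scalar(1) + 2).value)  # 3
```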
Two successful modern languages that compile to binary executables made opposite choices: Go doesn’t have operator overloading, but Rust does.
When I see patterns in my programs, I consider it a sign of trouble. The shape of a program should reflect only the problem it needs to solve. Any other regularity in the code is a sign, to me at least, that I’m using abstractions that aren’t powerful enough—often that I’m generating by hand the expansions of some macro that I need to write.
Every standard collection in Python is iterable.
An iterable is an object that provides an iterator, which Python uses to support operations like:
- for loops
- list, dict, and set comprehensions
- unpacking assignments
- construction of collection instances
Whenever Python needs to iterate over an object x, it automatically calls iter(x). The iter built-in function:
1. Checks whether the object implements __iter__, and calls that to obtain an iterator.
2. If __iter__ is not implemented, but __getitem__ is, then iter() creates an iterator that tries to fetch items by index, starting from 0 (zero).
3. If that fails, Python raises TypeError, usually saying 'C' object is not iterable, where C is the class of the target object.
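The steps above can be demonstrated with a small illustrative class that defines only __getitem__: iter() still builds a working index-based iterator for it, while an object with neither method raises TypeError.

```python
# Squares (illustrative) has no __iter__, but iter() falls back to
# fetching items by 0-based index via __getitem__.
class Squares:
    def __init__(self, stop):
        self.stop = stop

    def __getitem__(self, index):
        if index >= self.stop:
            raise IndexError(index)  # ends the index-based iteration
        return index * index

print(list(Squares(4)))  # [0, 1, 4, 9]
print(iter(Squares(4)))  # an index-based iterator object

try:
    iter(object())       # neither __iter__ nor __getitem__
except TypeError as exc:
    print(exc)           # 'object' object is not iterable
```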
As of Python 3.10, the most accurate way to check whether an object x is iterable is to call iter(x) and handle a TypeError exception if it isn’t. This is more accurate than using isinstance(x, abc.Iterable), because iter(x) also considers the legacy __getitem__ method, while the Iterable ABC does not.
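A sketch of the difference (LegacySeq and is_iterable are illustrative names): the Iterable ABC misses a class that supports iteration only through __getitem__, while the iter()-and-catch approach handles it.

```python
from collections import abc

# LegacySeq supports iteration only via __getitem__, so the
# Iterable ABC check gives a false negative.
class LegacySeq:
    def __getitem__(self, index):
        if index > 2:
            raise IndexError(index)
        return index

print(isinstance(LegacySeq(), abc.Iterable))  # False: no __iter__
print(list(LegacySeq()))                      # [0, 1, 2]: iter() still works

def is_iterable(x):
    """Check iterability the accurate way: try iter(x)."""
    try:
        iter(x)
    except TypeError:
        return False
    return True

print(is_iterable(LegacySeq()))  # True
print(is_iterable(42))           # False
```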
As usual with iterators, the d6_iter object in the example becomes useless once exhausted. To start over, we must rebuild the iterator by invoking iter() again.
iterable: Any object from which the iter built-in function can obtain an iterator. Objects implementing an __iter__ method returning an iterator are iterable. Sequences are always iterable, as are objects implementing a __getitem__ method that accepts 0-based indexes.
It’s important to be clear about the relationship between iterables and iterators: Python obtains iterators from iterables.
Python’s standard interface for an iterator has two methods:
- __next__: Returns the next item in the series, raising StopIteration if there are no more.
- __iter__: Returns self; this allows iterators to be used where an iterable is expected, for example, in a for loop.
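A hand-written iterator implementing exactly this two-method interface (Countdown is an illustrative class, not from the book):

```python
# Classic iterator: __next__ raises StopIteration when exhausted,
# __iter__ returns self so it works in a for loop.
class Countdown:
    def __init__(self, start):
        self.current = start

    def __next__(self):
        if self.current <= 0:
            raise StopIteration
        self.current -= 1
        return self.current + 1

    def __iter__(self):
        return self  # usable wherever an iterable is expected

print(list(Countdown(3)))  # [3, 2, 1]
```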
The best way to check if an object x is an iterator is to call isinstance(x, abc.Iterator). Thanks to Iterator.__subclasshook__, this test works even if the class of x is not a real or virtual subclass of Iterator.
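A quick structural demonstration (GenLike is an illustrative class): it neither inherits from nor registers with abc.Iterator, yet the subclass hook recognizes it by the presence of __next__ and __iter__.

```python
from collections import abc

# GenLike never subclasses or registers with abc.Iterator, but
# Iterator.__subclasshook__ recognizes it structurally.
class GenLike:
    def __next__(self):
        raise StopIteration

    def __iter__(self):
        return self

print(isinstance(GenLike(), abc.Iterator))  # True
print(issubclass(GenLike, abc.Iterator))    # True
```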
Because the only methods required of an iterator are __next__ and __iter__, there is no way to check whether there are remaining items, other than to call next() and catch StopIteration. Also, it’s not possible to “reset” an iterator. If you need to start over, you need to call iter() on the iterable that built the iterator in the first place. Calling iter() on the iterator itself won’t help either, because—as mentioned—Iterator.__iter__ is implemented by returning self, so this will not reset a depleted iterator.
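A short demo of both points: the exhausted iterator yields nothing more, calling iter() on it just returns the same depleted object, and only iter() on the original iterable gives a fresh start.

```python
# An iterator cannot be reset; rebuild it from the iterable.
s = 'ab'
it = iter(s)
print(list(it))        # ['a', 'b']
print(list(it))        # []: exhausted
print(iter(it) is it)  # True: __iter__ returns self, no reset happens
print(list(iter(s)))   # ['a', 'b']: fresh iterator from the iterable
```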
Iterators are also iterable, but iterables are not iterators.
Any Python function that has the yield keyword in its body is a generator function: a function which, when called, returns a generator object. In other words, a generator function is a generator factory.
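A minimal illustration of the factory behavior: calling the generator function runs none of its body; it just returns a generator object, which produces values lazily as it is driven.

```python
# A generator function is a factory of generator objects: the body
# runs only as the generator is driven by next()/iteration.
def gen_123():
    yield 1
    yield 2
    yield 3

g = gen_123()
print(g)        # a generator object, no body code has run yet
print(next(g))  # 1
print(list(g))  # [2, 3]: resumes where it was suspended
```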
A generator function builds a generator object that wraps the body of the function. When we invoke next() on the generator object, execution advances to the next yield in the function body, and the next() call evaluates to the value yielded when the function body is suspended. Finally, the enclosing generator object created by Python raises StopIteration when the function body returns, in accordance with the Iterator protocol.
It’s confusing to say a generator “returns” values. Functions return values. Calling a generator function returns a generator. A generator yields values. A generator doesn’t “return” values in the usual way: the return statement in the body of a generator function causes StopIteration to be raised by the generator object.
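This can be seen directly: the return value of a generator body travels in the StopIteration exception's value attribute, and plain iteration discards it.

```python
# return in a generator body raises StopIteration; the returned
# value rides along as StopIteration.value.
def gen_return():
    yield 1
    return 'done'

g = gen_return()
print(next(g))  # 1
try:
    next(g)
except StopIteration as exc:
    print(exc.value)  # done
```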
To iterate, the for machinery does the equivalent of g = iter(gen_AB()) to get a generator object, and then next(g) at each iteration.
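Spelled out by hand (using a stand-in generator, since gen_AB's body is from the book example), the for machinery is roughly equivalent to this while loop:

```python
# Manual equivalent of: for item in gen_ab(): print(item)
def gen_ab():  # stand-in for the book's gen_AB
    yield 'A'
    yield 'B'

g = iter(gen_ab())  # iter() on a generator returns the generator itself
while True:
    try:
        item = next(g)
    except StopIteration:
        break           # the for machinery swallows StopIteration
    print(item)         # A, then B
```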
The Iterator interface is designed to be lazy: next(my_iterator) yields one item at a time. The opposite of lazy is eager: lazy evaluation and eager evaluation are technical terms in programming language theory.
Generator expressions are syntactic sugar: they can always be replaced by generator functions, but sometimes are more convenient.
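The same lazy series written both ways, to show the equivalence:

```python
# A generator function and the equivalent generator expression.
def squares_gen(n):
    for i in range(n):
        yield i * i

squares_exp = (i * i for i in range(5))  # generator expression

print(list(squares_gen(5)))  # [0, 1, 4, 9, 16]
print(list(squares_exp))     # [0, 1, 4, 9, 16]
```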
My rule of thumb in choosing the syntax to use is simple: if the generator expression spans more than a couple of lines, I prefer to code a generator function for the sake of readability.
iterator: General term for any object that implements a __next__ method. Iterators are designed to produce data that is consumed by the client code, i.e., the code that drives the iterator via a for loop or other iterative feature, or by explicitly calling next(it) on the iterator—although this explicit usage is much less common. In practice, most iterators we use in Python are generators.
generator: An iterator built by the Python compiler. To create a generator, we don’t implement __next__. Instead, we use the yield keyword to make a generator function, which is a factory of generator objects. A generator expression is another way to build a generator object. Generator objects provide __next__, so they are iterators.
When implementing generators, know what is available in the standard library; otherwise, there’s a good chance you’ll reinvent the wheel.
The yield from expression syntax was introduced in Python 3.3 to allow a generator to delegate work to a subgenerator.
Before yield from was introduced, we used a for loop when a generator needed to yield values produced from another generator:
gen is the delegating generator, and sub_gen is the subgenerator. Note that yield from pauses gen, and sub_gen takes over until it is exhausted. The values yielded by sub_gen pass through gen directly to the client for loop. Meanwhile, gen is suspended and cannot see the values passing through it. Only when sub_gen is done, gen resumes.
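A minimal sketch of that arrangement (the names gen and sub_gen follow the text; the bodies are illustrative):

```python
# Delegating generator: yield from suspends gen while sub_gen's
# values pass straight through to the client.
def sub_gen():
    yield 1.1
    yield 1.2

def gen():
    yield 1
    yield from sub_gen()  # gen is suspended until sub_gen is exhausted
    yield 2

print(list(gen()))  # [1, 1.1, 1.2, 2]
```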
When the subgenerator contains a return statement with a value, that value can be captured in the delegating generator by using yield from as part of an expression.
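A sketch of the return-value capture (the function names here are illustrative): the yield from expression itself evaluates to the subgenerator's return value.

```python
# yield from evaluates to the subgenerator's return value.
def running_avg():
    total, count = 0.0, 0
    for value in [10, 20, 30]:
        total += value
        count += 1
        yield total / count
    return total / count  # travels via StopIteration to yield from

def delegating():
    result = yield from running_avg()  # captures the return value
    yield f'final average: {result}'

print(list(delegating()))  # [10.0, 15.0, 20.0, 'final average: 20.0']
```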
Note that the type Iterator is used for generators coded as functions with yield, as well as iterators written “by hand” as classes with __next__.
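An illustrative sketch: the same Iterator[str] annotation fits a generator function and a hand-written iterator class, so a consumer typed against Iterator[str] accepts either.

```python
from collections.abc import Iterator

# Both implementations satisfy Iterator[str].
def gen_letters() -> Iterator[str]:  # generator function
    yield 'a'
    yield 'b'

class LetterIterator:  # hand-written iterator class
    def __init__(self, text: str):
        self._it = iter(text)

    def __next__(self) -> str:
        return next(self._it)

    def __iter__(self) -> 'LetterIterator':
        return self

def consume(it: Iterator[str]) -> list:  # accepts either form
    return list(it)

print(consume(gen_letters()))         # ['a', 'b']
print(consume(LetterIterator('ab')))  # ['a', 'b']
```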
abc.Iterator[str] is consistent-with abc.Generator[str, None, None].
Generators able to consume and return values are coroutines.
A coroutine is really a generator function, created with the yield keyword in its body. And a coroutine object is physically a generator object. Despite sharing the same underlying implementation in C, the use cases of generators and coroutines in Python are so different that there are two ways to type hint them:
- Generators produce data for iteration
- Coroutines are consumers of data
- To keep your brain from exploding, don’t mix the two concepts together
- Coroutines are not related to iteration
- Note: there is a use of having `yield` produce a value in a coroutine, but it’s not tied to iteration
No instance attributes or closures are needed to keep the context while the coroutine is suspended waiting for the next .send(). That’s why coroutines are attractive replacements for callbacks in asynchronous programming—they keep local state between activations.
Calling next() or .send(None) to advance to the first yield is known as “priming the coroutine.”
After each activation, the coroutine is suspended precisely at the yield keyword, waiting for a value to be sent. The line coro_avg.send(10) provides that value, causing the coroutine to activate. The yield expression resolves to the value 10, assigning it to the term variable. The rest of the loop updates the total, count, and average variables. The next iteration in the while loop yields the average, and the coroutine is again suspended at the yield keyword.
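The walk-through above can be reconstructed as a running-average coroutine (the names coro_avg, term, total, count, and average follow the text; the exact book listing may differ):

```python
# Running-average coroutine: each .send() resumes at the yield,
# which evaluates to the sent value.
def averager():
    total = 0.0
    count = 0
    average = None
    while True:
        term = yield average  # suspended here; .send(v) makes this v
        total += term
        count += 1
        average = total / count

coro_avg = averager()
next(coro_avg)            # prime: advance to the first yield
print(coro_avg.send(10))  # 10.0
print(coro_avg.send(30))  # 20.0
print(coro_avg.send(5))   # 15.0
```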
We don’t usually need to terminate a generator, because it is garbage collected as soon as there are no more valid references to it. If you need to explicitly terminate it, use the .close() method.
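A small demo of explicit termination (echo is an illustrative coroutine): .close() raises GeneratorExit inside the suspended generator, which lets cleanup code run and leaves the generator finished.

```python
# .close() raises GeneratorExit at the suspended yield; the
# finally block runs, then the generator is finished.
def echo():
    try:
        while True:
            received = yield
            print('got', received)
    finally:
        print('cleanup')  # runs on .close()

e = echo()
next(e)         # prime
e.send('hi')    # got hi
e.close()       # cleanup
print(e.gi_frame is None)  # True: frame released, generator finished
```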
A delegating generator can get the return value of a coroutine directly using the yield from syntax.
In practice, productive work with coroutines requires the support of a specialized framework.
It makes sense that these are covariant, because any code expecting a coroutine that yields floats can use a coroutine that yields integers. That’s why Generator is covariant on its YieldType parameter. The same reasoning applies to the ReturnType parameter—also covariant.
The integration of the Iterator pattern in the semantics of Python is a prime example of how design patterns are not equally applicable in all programming languages.
Although difficult to use in practice, classic coroutines are the foundation of native coroutines, and the yield from expression is the direct precursor of await.
The with statement sets up a temporary context and reliably tears it down, under the control of a context manager object. This prevents errors and reduces boilerplate code, making APIs at the same time safer and easier to use.
Context manager objects exist to control a with statement, just like iterators exist to control a for statement.
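A minimal hand-written context manager (Managed is an illustrative class) showing the protocol the with statement drives: __enter__ runs at the top of the block and its result is bound to the as target; __exit__ runs on the way out, even if an exception escapes the block.

```python
# Minimal context manager driven by a with statement.
class Managed:
    def __init__(self):
        self.events = []

    def __enter__(self):
        self.events.append('enter')
        return self          # bound to the `as` target

    def __exit__(self, exc_type, exc_value, traceback):
        self.events.append('exit')
        return False         # don't swallow exceptions

m = Managed()
with m as ctx:
    ctx.events.append('body')
print(m.events)  # ['enter', 'body', 'exit']
```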