5 Signs You’ve Become an Advanced Pythonista Without Even Realizing It


Image by Charles Thonney from Pixabay

Introduction

You’ve been programming in Python for a while now, whipping up scripts and solving problems left and right. You think you’re pretty good, don’t you? Well, hold on to your hats, folks, because you might just be an advanced Pythonista without even realizing it!

From closures to context managers, I’ve got a list of advanced Python features that will make you say, “I’ve been using that all along!”.

Even if these things are new to you, you’ll have an excellent checklist to complete to take your game to the next level.

1. Scope

A critical aspect of advanced Python programming is deep familiarity with the concept of scope.

Scope defines the order in which the Python interpreter looks up names in a program. Python scope follows the LEGB rule (local, enclosing, global, and built-in scopes). According to the rule, when you access a name (it can be anything, a variable, a function, or a class), the interpreter looks for it in local, enclosing, global, and built-in scopes, in order.

Let’s see examples to understand each level better.

Example 1 — Local Scope

def func():
    x = 10
    print(x)

func()    # 10
print(x)  # Raises NameError, x is only defined within the scope of func()

Here, x is only defined in the scope that is local to func. That’s why it isn’t accessible anywhere else in the script.

Example 2 — Enclosing Scope

def outer_func():
    x = 20
    def inner_func():
        print(x)
    inner_func()

outer_func()  # 20

Enclosing scope is the intermediary scope between local and global scopes. In the example above, x is in the local scope of outer_func. Relative to the nested inner_func, however, x is in the enclosing scope. By default, the local scope only has read access to names in the enclosing scope; to rebind them, the inner function must declare them with the nonlocal keyword.
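If the inner function needs to modify a name from the enclosing scope rather than just read it, the nonlocal keyword rebinds the name there instead of creating a new local. A minimal sketch:

```python
def outer():
    x = 20
    def inner():
        nonlocal x  # rebind x in outer's scope instead of creating a new local
        x += 1
    inner()
    return x

print(outer())  # 21
```

Without the nonlocal declaration, the assignment inside inner would raise UnboundLocalError, because x += 1 would be treated as a brand-new local variable read before assignment.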

Example 3 — Global Scope

x = 30

def func():
    print(x)

func()  # 30

Here, x and func are defined in the global scope, which means they can be read from anywhere in the current script.

To modify a global name from a narrower scope (local or enclosing), you must declare it with the global keyword:

def func2():
    global x
    x = 40
    print(x)

func2()   # 40
print(x)  # 40

Example 4 — Built-in scope

Built-in scope contains the names Python defines automatically, with no import statement required. It includes built-in functions like print, len, and range, and built-in types such as str, int, and float.

2. Function closure

A firm grasp of scope opens the doors to another important concept — function closure.

By default, a function forgets everything once it finishes executing: its arguments and local variables are discarded.

def func(x):
    return x ** 2

func(3)   # 9
print(x)  # NameError

Above, we passed the value 3 as x, but the function forgot it after execution. What if we don’t want it to forget the value of x?

This is where function closure comes into play. When a variable is defined in the enclosing scope of some inner function, the inner function keeps a reference to it even after the outer function returns.

Here is a simple example function that counts the number of times it was executed:

def counter():
    count = 0
    def inner():
        nonlocal count
        count += 1
        return count
    return inner

# Return the inner function
counter = counter()
print(counter())  # 1
print(counter())  # 2
print(counter())  # 3

By the ordinary rules of Python, we should have lost the count variable as soon as counter returned. But since it lives in the inner function’s closure, it stays there until you end the session:

counter.__closure__[0].cell_contents  # 3

3. Decorators

Function closures have more serious applications than simple counters. One of them is creating decorators. A decorator is a function that wraps another function to enhance or even modify its behavior.

For example, below we create a caching decorator that remembers the result for every combination of positional and keyword arguments a function is called with.

def stateful_function(func):
    cache = {}
    def inner(*args, **kwargs):
        key = str(args) + str(kwargs)
        if key not in cache:
            cache[key] = func(*args, **kwargs)
        return cache[key]
    return inner

The stateful_function decorator can now be applied to computation-heavy functions that may be called repeatedly with the same arguments. An example is the following recursive Fibonacci function, which returns the nth number in the sequence:

%%time

@stateful_function
def fibonacci(n):
    if n <= 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fibonacci(n - 1) + fibonacci(n - 2)

fibonacci(1000)

CPU times: user 1.53 ms, sys: 88 µs, total: 1.62 ms
Wall time: 1.62 ms

Out: 43466557686937456435688527675040625802564660517371780402481729089536555417949051890403879840079255169295922593080322634775209689623239873322471161642996440906533187938298969649928516003704476137795166849228875

We found the humongous 1000th number in the Fibonacci sequence in a fraction of a second. Here is how much the same process would take without the caching decorator:

%%time

def fibonacci(n):
    if n <= 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fibonacci(n - 1) + fibonacci(n - 2)

fibonacci(40)

CPU times: user 21 s, sys: 0 ns, total: 21 s
Wall time: 21 s

Out: 102334155

It took 21 seconds to calculate the 40th number. It would take days to calculate the 1000th without caching.
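Worth knowing: the standard library already ships this memoization pattern as functools.lru_cache, so in practice you rarely need to hand-roll the cache. A sketch of the same memoized Fibonacci using it:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # cache every distinct argument, with no size limit
def fibonacci(n):
    if n <= 0:
        return 0
    elif n == 1:
        return 1
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(40))  # 102334155, computed in milliseconds thanks to the cache
```

Since Python 3.9 you can also write functools.cache, which is simply an alias for lru_cache(maxsize=None).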

You can learn the hairy details of how to create your own decorators (including scope and closures) in my separate post.

4. Generators

Generators are powerful constructs in Python that allow you to process large amounts of data efficiently.

Let’s say you have a 10GB log file after the crash of some software. To find out what went wrong, you have to efficiently sift through it in Python.

The worst way to do this is to read the whole file like below:

with open("logs.txt", "r") as f:
    contents = f.read()

print(contents)

Since you go through the logs sequentially, you don’t need all 10GB in memory at once; you only need a chunk at a time. This is where you can use generators:

def read_large_file(filename):
    with open(filename) as f:
        while True:
            chunk = f.read(1024)
            if not chunk:
                break
            yield chunk  # Generators use `yield` instead of `return`

for chunk in read_large_file("logs.txt"):
    process(chunk)  # Process the chunk

Above, we defined a generator that reads the log file in chunks of 1024 characters at a time. As a result, the for loop at the end is highly memory-efficient: in every iteration, only a single chunk of the file is in memory. Each previous chunk is discarded, and the rest are loaded only as needed.

Another feature of generators is the ability to yield one element at a time, even outside loops, with the next function. Below, we define a blazing-fast generator function that produces the Fibonacci sequence.

To create the generator, you call the function once and call next on the resulting object:

def fibonacci():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

fib = fibonacci()
type(fib)  # generator

print(next(fib))  # 0
print(next(fib))  # 1
print(next(fib))  # 1
print(next(fib))  # 2
print(next(fib))  # 3
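Because a generator is an iterator, it also composes with the itertools module. For instance, itertools.islice can cut a finite window out of the infinite stream (a small sketch reusing the same generator):

```python
from itertools import islice

def fibonacci():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

# Take the first ten values without ever materializing the infinite sequence
first_ten = list(islice(fibonacci(), 10))
print(first_ten)  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

This is the generator idiom in a nutshell: describe an unbounded computation once, then consume exactly as much of it as you need.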

You can read the following post on generators to learn more.

5. Context managers

You must have been using context managers for a long time now. They allow developers to manage resources efficiently, like files, databases, and network connections. They automatically open and close resources, leading to clean and error-free code.

But, there is a big difference between using context managers and writing your own. When done right, they allow you to abstract a lot of boilerplate code on top of their original functionality.

One popular example of a custom context manager is a timer:

import time

class TimerContextManager:
    """
    Measure the time it takes to run
    a block of code.
    """
    def __enter__(self):
        self.start = time.time()

    def __exit__(self, type, value, traceback):
        end = time.time()
        print(f"The code took {end - self.start:.2f} seconds to execute.")

Above, we are defining a TimerContextManager class that will serve as our future context manager. Its __enter__ method defines what happens when we enter the context with the with keyword. In this case, we start the timer.

In __exit__, we go out of the context, stop the timer, and report elapsed time.

with TimerContextManager():
    # This code is timed
    time.sleep(1)

The code took 1.00 seconds to execute.
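For simple cases like this timer, the standard library’s contextlib.contextmanager decorator lets you write a context manager as a generator function instead of a class. A minimal sketch of the same idea:

```python
import time
from contextlib import contextmanager

@contextmanager
def timer():
    start = time.time()
    try:
        yield  # the body of the with-block runs here
    finally:
        print(f"The code took {time.time() - start:.2f} seconds to execute.")

with timer():
    time.sleep(1)
```

Everything before the yield plays the role of __enter__, everything after it plays the role of __exit__, and the try/finally guarantees the timing is reported even if the block raises.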

Here is a more complex example that locks a resource so that only one thread can use it at a time.

import threading

lock = threading.Lock()

class LockContextManager:
    def __enter__(self):
        lock.acquire()

    def __exit__(self, type, value, traceback):
        lock.release()

with LockContextManager():
    # This code is executed with the lock acquired.
    # Only one thread can be inside this block at a time.
    ...

# The lock is automatically released when the with block ends, even if an error occurs.

If you want a more gentle introduction to context managers, check out my article on the topic.

If you want to go down the rabbit hole and learn everything about them, here is another excellent RealPython article.

Conclusion

There you have it, folks! How many times did you say, “I knew that!”? Even if it wasn’t that many times, you now know the things to learn to become advanced.

Don’t be afraid of the discomfort that comes with learning new things. Just remember, with great power comes (I won’t say it!) more challenging bugs to fix. But hey, you are a pro now, what’s a little debugging to you?

Thank you for reading!

