Add additional modules for use in transformation

I am writing a Memgraph transformation in Python.

When I import modules such as “requests” or “networkx”, the transformation works as expected.

I have Avro data with a schema registry, so I need to deserialize it. I followed the Memgraph example here: Import Avro data | Memgraph Docs
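For context, the relevant part of my transform follows that example and looks roughly like this (a trimmed-down sketch, not my real code: the registry URL, topic handling, Cypher query, and record fields are placeholders):

```python
import mgp

from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer
from confluent_kafka.serialization import MessageField, SerializationContext

# Placeholder – in my setup this points at the Schema Registry host.
SCHEMA_REGISTRY_URL = "http://schema-registry:8081"


@mgp.transformation
def avro_transform(
    messages: mgp.Messages,
) -> mgp.Record(query=str, parameters=mgp.Nullable[mgp.Map]):
    # Created per batch for brevity; the client could also live at module level.
    registry = SchemaRegistryClient({"url": SCHEMA_REGISTRY_URL})
    deserializer = AvroDeserializer(registry)

    result_queries = []
    for i in range(messages.total_messages()):
        message = messages.message_at(i)
        # Deserialize the Avro payload using the writer schema from the registry.
        record = deserializer(
            message.payload(),
            SerializationContext(message.topic_name(), MessageField.VALUE),
        )
        if record is None:  # e.g. a tombstone message with no value
            continue
        # Placeholder Cypher – the real transform maps record fields to the graph.
        result_queries.append(
            mgp.Record(
                query="CREATE (n:Event {id: $id})",
                parameters={"id": record.get("id")},
            )
        )
    return result_queries
```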

When I save the transformation with those imports, I receive the error:

```
[Error] Unable to load module "/memgraph/internal_modules/test_try_plz.py";
Traceback (most recent call last):
  File "/memgraph/internal_modules/test_try_plz.py", line 4, in <module>
    from confluent_kafka.schema_registry import SchemaRegistryClient
ModuleNotFoundError: No module named 'confluent_kafka'
For more details, visit https://memgr.ph/modules.
```

How can I update my transform or my Memgraph instance to include the confluent_kafka module?

The link provided in the error message did not give me any leads, at least not to me.

The problem is that the environment in which you’re running Memgraph does not have confluent_kafka installed. To fix that, define your project with a Dockerfile (or Docker Compose) and install the dependency inside the container: list it in a requirements file, install that file from the Dockerfile, and run the resulting Memgraph container as part of the Docker Compose project. The reason you can import modules such as requests or networkx is that they are Memgraph’s own dependencies and are installed along with it.
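Roughly, something like this works (a sketch under assumptions: the memgraph/memgraph base image and tag, that pip is available in it, and that the image runs as a non-root memgraph user by default; adjust for your setup):

```
# requirements.txt
confluent-kafka[avro]    # the [avro] extra also pulls in the Avro deserialization dependencies (e.g. fastavro)
```

```dockerfile
# Dockerfile – extends the official Memgraph image with the extra Python packages
FROM memgraph/memgraph:latest

USER root
COPY requirements.txt /tmp/requirements.txt
# If pip is missing in the image you use, install it here first (e.g. python3-pip via apt).
RUN python3 -m pip install --no-cache-dir -r /tmp/requirements.txt
# Switch back to the image's default non-root user.
USER memgraph
```

In the Compose file, point the memgraph service at this Dockerfile with `build: .` instead of a plain `image:` entry, so the packages are baked into the container your transformation runs in.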

I will check why confluent_kafka is not preinstalled. I also see that the docs you linked are for version 2.3.0, so please refer to the latest docs in the future. I will get back to you about Avro.