Setting Up a REPL in a NestJS Project with Mikro-ORM: A Django Shell Equivalent
2024-10-21
After spending considerable time working with the Python framework Django, I've recently ventured into the Node.js world with NestJS. One of the features I deeply missed from Django was the Django Shell. It was an incredibly useful tool that allowed me to interact with my application in a Python shell, test out code snippets, and manipulate data directly using the Django ORM.
In the NestJS ecosystem, I was searching for a similar interactive environment and discovered that what I was looking for is called a REPL (Read-Evaluate-Print Loop). A REPL provides an interactive shell where you can execute code in real-time within the context of your application.
In this post, I'll show you how to set up a REPL in a NestJS project that uses Mikro-ORM, drawing from my experience adapting code from this GitHub repository.
A REPL is invaluable for:

- Testing code snippets in real time within the context of your running application
- Querying and manipulating data directly through the ORM
- Debugging and exploring your application's modules and services interactively
Here's how you can set up a REPL in your NestJS project:
In the root of your project, create a file named `repl.ts` with the following content:
```typescript
import 'tsconfig-paths/register';
import { repl } from '@nestjs/core';
import { AppModule } from './app.module';
import { MikroORM } from '@mikro-orm/core';
import { commonMikroOrmConfig } from './mikro-orm.config';
import { Post } from './posts/post.entities';

async function bootstrap() {
  const replServer = await repl(AppModule);
  const { context } = replServer;

  const orm = await MikroORM.init({
    ...commonMikroOrmConfig,
    allowGlobalContext: true,
    entitiesTs: ['./**/*.entities.ts'],
    entities: ['./dist/**/*.entities.js'],
    discovery: {
      warnWhenNoEntities: false,
    },
  });

  // Add your entities and ORM to the REPL context for easy access
  context.Post = Post;
  context.orm = orm;
  context.em = orm.em;
}

bootstrap();
```
A few things to note: we import `repl` from `@nestjs/core` and `MikroORM` from `@mikro-orm/core`. The `bootstrap` function starts the REPL server, initializes the ORM, and adds the ORM (`orm`), the EntityManager (`em`), and any entities (like `Post`) to the REPL context for easy access.

Run the following command in your terminal:

```bash
npm run start -- --entryFile repl
```
This tells NestJS to use `repl.ts` as the entry file instead of the default `main.ts`.
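If you start the REPL often, you can add a dedicated script to your package.json. A small sketch, assuming your start script runs nest start (the script name `repl` is just a suggestion):

```json
{
  "scripts": {
    "repl": "nest start --entryFile repl"
  }
}
```

Then `npm run repl` drops you straight into the shell.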
Once the REPL starts, you'll see a prompt like this:
```
[info] MikroORM successfully connected to database blog_db on postgresql://blog_user:*****@127.0.0.1:5432
>
```
Now you can interact with your application. Here's an example of querying all `Post` entities:
```
> const posts = await em.find(Post, {});
[query] select "p0".* from "post" as "p0" [took 5 ms, 2 results]
> posts
[
  {
    id: 1,
    title: 'First Post',
    content: 'This is the first post.',
    createdAt: 2023-10-21T12:34:56.789Z
  },
  {
    id: 2,
    title: 'Second Post',
    content: 'This is the second post.',
    createdAt: 2023-10-22T08:15:30.123Z
  }
]
```
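You can create and persist new entities the same way. A quick sketch (the field values are just examples, and it assumes createdAt is filled in by a default on the entity):

```
> const post = em.create(Post, { title: 'Third Post', content: 'Written from the REPL.' });
> await em.persistAndFlush(post);
```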
From the REPL you can use `Post`, `User`, or any other entities you've added to the context, and `em` is available for database operations.

Setting up a REPL in your NestJS project with Mikro-ORM bridges the gap between Django's interactive shell and the Node.js world. It enhances productivity by allowing real-time interaction with your application's context and database.
Feel free to explore and extend this setup by adding more entities or custom services to the REPL context.
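For instance, the NestJS REPL ships with a built-in `get()` helper for pulling providers out of the dependency injection container, so you can call service methods directly (`PostsService` below is a hypothetical provider):

```
> get(PostsService).findAll()
```

Happy coding!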
When building an application with NestJS and Mikro-ORM in TypeScript, ensuring proper testing is essential to maintain code quality and reliability. In this post, I will cover three main testing strategies for database-related operations, each with its pros and cons.
In the first approach, you set up an in-memory SQLite database during tests to get real persistence behavior without interacting with an external database.
Pros:

- Fast and self-contained: no external database server or Docker container is needed
- Queries actually run against a real database engine, so the ORM layer is genuinely exercised

Cons:

- SQLite's SQL dialect and feature set differ from production databases such as PostgreSQL, so some queries may behave differently or fail
```typescript
import { MikroORM } from '@mikro-orm/core';
import { SqliteDriver } from '@mikro-orm/sqlite';
import { User } from './user.entity'; // example entity

describe('User Service - In-Memory DB', () => {
  let orm: MikroORM;

  beforeAll(async () => {
    orm = await MikroORM.init({
      entities: [User],
      dbName: ':memory:',
      driver: SqliteDriver, // use the imported SQLite driver
      allowGlobalContext: true, // allow using the global EntityManager directly in tests
    });
    // Create the schema inside the in-memory database
    const generator = orm.getSchemaGenerator();
    await generator.createSchema();
  });

  afterAll(async () => {
    await orm.close(true);
  });

  it('should persist and retrieve a user entity', async () => {
    const userRepo = orm.em.getRepository(User);
    const user = userRepo.create({ name: 'John Doe' });
    await userRepo.persistAndFlush(user);

    const retrievedUser = await userRepo.findOne({ name: 'John Doe' });
    expect(retrievedUser).toBeDefined();
    expect(retrievedUser?.name).toBe('John Doe');
  });
});
```
This setup is relatively straightforward, but keep in mind the limitations regarding database compatibility. Note also that the Mikro-ORM creator does not recommend this approach, although the Mikro-ORM repo itself uses it for some tests anyway.
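If you adopt this approach, you will likely also want each test to start from a clean database. A minimal sketch using the schema generator from the example above:

```typescript
beforeEach(async () => {
  // Drop and recreate the schema so every test starts from a clean slate
  await orm.getSchemaGenerator().refreshDatabase();
});
```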
Another option is to initialize Mikro-ORM with the same driver you'd use in production but prevent it from connecting to a real database by setting `connect: false`. This can be a quick setup, especially when you don't need to run real database queries.
Pros:

- Very quick to set up: no database server is required at all
- Uses the same driver and entity metadata as production

Cons:

- No queries actually run, so every database interaction has to be mocked
- Nothing is persisted, which limits how much real behavior the tests exercise
```typescript
import { MikroORM } from '@mikro-orm/core';
import { PostgreSqlDriver } from '@mikro-orm/postgresql';
import { User } from './user.entity';

describe('User Service - No DB Connection', () => {
  let orm: MikroORM;

  beforeAll(async () => {
    orm = await MikroORM.init({
      entities: [User],
      dbName: 'test-db',
      driver: PostgreSqlDriver, // same driver as production
      connect: false, // prevent a real connection
    });
  });

  it('should mock user creation and retrieval', async () => {
    const mockUser = { id: 1, name: 'Mock User' } as User;
    const userRepo = orm.em.getRepository(User);
    // persistAndFlush resolves with void, so the mock simply resolves
    jest.spyOn(userRepo, 'persistAndFlush').mockResolvedValue(undefined);
    jest.spyOn(userRepo, 'findOne').mockResolvedValue(mockUser);

    await userRepo.persistAndFlush(mockUser);
    const foundUser = await userRepo.findOne({ name: 'Mock User' });

    expect(foundUser).toBeDefined();
    expect(foundUser?.name).toBe('Mock User');
  });
});
```
This approach works well for unit tests where database interaction is mocked. However, the lack of actual persistence may make your tests less reliable.
Mocking everything is an approach where you mock both the repository methods and any related services to simulate the behavior of the database without involving actual ORM operations. See an example in the nestjs-realworld-example-app here.
Pros:

- Extremely fast: neither the ORM nor a database is involved
- Keeps unit tests tightly focused on business logic

Cons:

- Mocks can drift from the real repository behavior, so tests may pass while the real code fails
- Writing and maintaining mocks adds boilerplate
```typescript
import { Test, TestingModule } from '@nestjs/testing';
import { UserService } from './user.service';
import { User } from './user.entity';
import { getRepositoryToken } from '@mikro-orm/nestjs';

describe('User Service - Full Mock', () => {
  let userService: UserService;

  const mockRepository = {
    persistAndFlush: jest.fn(),
    findOne: jest.fn(),
  };

  beforeEach(async () => {
    const module: TestingModule = await Test.createTestingModule({
      providers: [
        UserService,
        { provide: getRepositoryToken(User), useValue: mockRepository },
      ],
    }).compile();

    userService = module.get<UserService>(UserService);
  });

  it('should create and return a user', async () => {
    const mockUser = { id: 1, name: 'Mock User' };
    mockRepository.persistAndFlush.mockResolvedValue(mockUser);
    mockRepository.findOne.mockResolvedValue(mockUser);

    const createdUser = await userService.create({ name: 'Mock User' });
    const foundUser = await userService.findOne({ name: 'Mock User' });

    expect(createdUser).toEqual(mockUser);
    expect(foundUser).toEqual(mockUser);
  });
});
```
This is particularly useful in unit tests where the focus is on testing business logic rather than database interaction.
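Because the repository is a plain mock object, you can also assert on how the service interacted with it, assuming the service delegates creation to persistAndFlush. For example, after calling the service:

```typescript
// Verify the service handed the expected data to the repository
expect(mockRepository.persistAndFlush).toHaveBeenCalledTimes(1);
expect(mockRepository.persistAndFlush).toHaveBeenCalledWith(
  expect.objectContaining({ name: 'Mock User' }),
);
```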
Choosing the right testing strategy depends on the scope and type of your tests:

- Use the in-memory SQLite approach for integration-style tests that should exercise real ORM queries without a real database server.
- Use the `connect: false` approach when you want the production driver's metadata without any actual connection.
- Mock everything for pure unit tests that focus on business logic.
Consider mixing and matching these approaches based on the requirements of your project to balance accuracy, speed, and simplicity.
Currently, this blog fetches data from an external REST API. You can find more details here.
In my recent work, I focused on decoupling my components from the data source. My goal was to transition from code like this:
```typescript
export default async function Home({
  searchParams,
}: {
  searchParams?: HomeSearchParams;
}) {
  const posts = await fetch("https://rest-api-url.com/").then((res) => res.json());
  // ...
```
Here, we're making a `fetch` call to an external REST API to retrieve post objects.
To something like this:
```typescript
export default async function Home({
  searchParams,
}: {
  searchParams?: HomeSearchParams;
}) {
  const posts = await activeDataProvider.getAll();
  // ...
```
With these changes, we introduced a new layer between data-fetching operations and the component itself. I refer to this layer as the "data provider." I defined an interface specifying the required and optional methods for a data provider:
```typescript
export interface IDataProvider {
  getAll(options?: PostSearchOptions): Promise<PaginatedPosts>;
  getBySlug(slug: string): Promise<Post | null>;
  create?(data: Partial<Post>): Promise<Post>;
  update?(slug: string, data: Partial<Post>): Promise<Post | null>;
  delete?(slug: string): Promise<boolean>;
}
```
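To make this concrete, here is a rough sketch of an in-memory implementation of the interface, similar in spirit to the MemoryDataProvider used in the tests later in this post. The details are illustrative, not the actual implementation: it assumes Post has title and slug fields, PaginatedPosts is { results: Post[]; totalPages: number }, and PostSearchOptions carries an optional query string.

```typescript
export class MemoryDataProvider implements IDataProvider {
  constructor(private readonly posts: Post[]) {}

  async getAll(options?: PostSearchOptions): Promise<PaginatedPosts> {
    // Naive filtering over the in-memory array; no real pagination
    const query = options?.query?.toLowerCase() ?? '';
    const results = this.posts.filter((post) =>
      post.title.toLowerCase().includes(query),
    );
    return { results, totalPages: 1 };
  }

  async getBySlug(slug: string): Promise<Post | null> {
    return this.posts.find((post) => post.slug === slug) ?? null;
  }
}
```

Because create, update, and delete are optional in the interface, a read-only provider like this still satisfies IDataProvider.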
This approach allows us to easily switch data sources in the future. For example, if we decide to fetch data directly from a database, we would simply create a new `DbDataProvider` that implements `IDataProvider`.
We would then only need to update the `data-providers/active.ts` file to use the new `DbDataProvider`:
```typescript
import { DbDataProvider } from './db';

const activeDataProvider = new DbDataProvider();

export default activeDataProvider;
```
By modifying just one file (after creating the new data provider), you can change the app's persistence layer.
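You could take this a step further and select the provider from configuration. A minimal sketch, assuming a hypothetical DATA_PROVIDER environment variable and a REST-based provider module alongside the database one:

```typescript
import { DbDataProvider } from './db';
import { RestAPIDataProvider } from './rest-api'; // hypothetical REST provider module

// Choose the persistence layer via an environment variable
const activeDataProvider =
  process.env.DATA_PROVIDER === 'db'
    ? new DbDataProvider()
    : new RestAPIDataProvider();

export default activeDataProvider;
```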
Another significant benefit of this approach is improved testability. Initially, I aimed to replace the active data provider with a `TestDataProvider` that returns hard-coded data for unit tests. I planned to inject the active data provider as a dependency into Next.js page components like this:
```typescript
export default async function Home({
  dataProvider = activeDataProvider,
  searchParams,
}: HomeProps) {
  // ...
```
This setup allowed me to pass the test data provider as a parameter to the component:
```tsx
<Suspense>
  <Home searchParams={searchParams} dataProvider={testDataProvider} />
</Suspense>
```
While this worked well in development, I encountered errors when running `next build`, such as:
```
Type error: Page "app/page.tsx" has an invalid "default" export:
  Type "HomeProps" is not valid.
 ELIFECYCLE  Command failed with exit code 1.
Error: Command "pnpm run build" exited with 1
```
The issue was that Next.js page components cannot accept props other than `params` or `searchParams` (source).
Since dependency injection was not possible, I ended up using `spyOn` calls in my unit tests. Although I aimed to avoid mocks and spies, I couldn't find an alternative when dependency injection wasn't feasible.
Despite this, the testability of the code improved. For example, the test case initially looked like this:
```tsx
import { getPostsAndTotalPages } from "../../app/lib/fetchPosts";

test("Home page component should match the snapshot", async () => {
  const searchParams = {
    query: "",
    page: "1",
  };
  const getPostsAndTotalPagesMock = getPostsAndTotalPages as Mock;
  getPostsAndTotalPagesMock.mockResolvedValue({
    posts: generateMockPostAPIResponse().results,
    totalPages: 2,
  });

  const { container } = render(
    <Suspense>
      <Home searchParams={searchParams} />
    </Suspense>
  );

  // Access the screen first; otherwise, toMatchSnapshot will generate an empty snapshot
  await screen.findByText("Post 1");
  expect(container).toMatchSnapshot();
});
```
After the changes, it now looks like this:
```tsx
const jsonData = JSON.parse(readFileSync('tests/test-data.json', 'utf-8'));
const memoryDataProvider = new MemoryDataProvider(jsonData);

test('Component should match the snapshot', async () => {
  const postSlug = 'post-1';
  const params = {
    slug: postSlug,
  };
  vi.spyOn(
    activeDataProvider,
    'getSinglePostFromStorage',
  ).mockImplementation(() => memoryDataProvider.getSinglePostFromStorage(postSlug));

  const { container } = render(
    <Suspense>
      <SinglePostPage params={params} />
    </Suspense>,
  );

  // Access the screen first; otherwise, toMatchSnapshot will generate an empty snapshot
  await screen.findByText('Post 1');
  expect(container).toMatchSnapshot();
});
```
The revised test case is now less coupled to the implementation details of fetching post data. This makes the tests more robust and simplifies future code changes.
I hope some of this can also be helpful for you. Happy decoupling! 🚀