NestJS is one of the most popular Node.js frameworks for building reliable and scalable server-side applications. It has its own philosophy and provides an architecture out of the box. NestJS is built with TypeScript and is heavily inspired by Angular. It also has really nice documentation.

With the tutorial below you’ll be able to explore this framework by building a CRUD application with a simple auth module and e2e tests. In addition, you’ll use the Prisma ORM for data modeling and migrations, and Passport.js for authentication (you’ve almost certainly used it before).
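
To give a taste of the code you’ll be writing, here is a minimal, hypothetical NestJS controller (the "bookmarks" route and the payload shape are my own illustration, not taken from the video):

```typescript
import { Controller, Get, Post, Body } from '@nestjs/common';

// Hypothetical CRUD controller, just to show the NestJS flavour:
// decorators declare the routes, and the class groups the handlers.
@Controller('bookmarks')
export class BookmarksController {
  private bookmarks: { id: number; url: string }[] = [];

  // GET /bookmarks - return all bookmarks
  @Get()
  findAll() {
    return this.bookmarks;
  }

  // POST /bookmarks - create a bookmark from the request body
  @Post()
  create(@Body() body: { url: string }) {
    const bookmark = { id: this.bookmarks.length + 1, url: body.url };
    this.bookmarks.push(bookmark);
    return bookmark;
  }
}
```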

Tutorial:
https://youtu.be/GHTA143_b-s

Official documentation:
https://docs.nestjs.com

For more useful info - subscribe to the Tech Read channel.
Likes and shares are welcome.

#nestjs #nodejs #typescript #prisma #e2e
Continuing the theme of NestJS, I want to share an article about Clean Architecture.

https://betterprogramming.pub/clean-node-js-architecture-with-nestjs-and-typescript-34b9398d790f

In a simple example you’ll see how to build services layer by layer to separate business logic from frameworks. This is critical if you want to create a scalable, well-testable system with the ability to switch between different modules, databases, etc.
As Robert C. Martin puts it in the epigraph of this article: “Your architectures should tell readers about the system, not about the frameworks you used in your system”.
Hope you agree with him (I do).
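
As a rough sketch of the layering idea (all names below are illustrative, not from the article): the use case depends only on an abstract repository, and the outer layer decides which framework- or database-specific adapter to inject.

```typescript
import { randomUUID } from 'node:crypto';

// The inner layer knows nothing about NestJS, Prisma or HTTP -
// it depends only on this abstraction (a "port").
interface UserRepository {
  findByEmail(email: string): Promise<{ id: string; email: string } | null>;
  save(user: { id: string; email: string }): Promise<void>;
}

// Use case: pure business logic, easy to unit-test with an in-memory repository.
class RegisterUser {
  constructor(private readonly users: UserRepository) {}

  async execute(email: string): Promise<string> {
    if (await this.users.findByEmail(email)) {
      throw new Error('Email is already taken');
    }
    const id = randomUUID();
    await this.users.save({ id, email });
    return id;
  }
}

// The outer layer decides which adapter to plug in:
// a Prisma-backed repository in production, an in-memory one in tests.
```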

For more useful info - subscribe to the Tech Read channel.
Likes and shares are welcome.

#nodejs #nestjs #cleanarchitecture
One of the best articles about monorepos, written by Michael Solomon and Yoni Goldberg:

Which Monorepo is right for a Node.js BACKEND now?

You will learn about monorepos in general and what problems they solve. There is also a comparison of the features and strengths of different monorepo tools - nx, Turborepo, Lerna - as well as scenarios for synchronized and independent workflows.

For more useful info - subscribe to the Tech Read channel.
Likes and shares are welcome.

#nodejs #monorepo #nx #turborepo #lerna
Just a list of useful npm packages for Node.js.

Node.js Packages I Use In Every Project

For more info - subscribe to Tech Read channel.
Likes and shares are welcome.

#nodejs #npm
A two-part article about hashing algorithms in Node.js - What Is The Best Algorithm (Bcrypt, Scrypt, SHA512, Argon2) For Password Hashing In Node.js?

Links:
Part 1
Part 2

The author describes not only the common algorithms/npm modules (Bcrypt, Scrypt, SHA512, Argon2) but also their pros, cons, and benchmarks, so you can choose what to use in your Node.js application.
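
For context, this is roughly what password hashing looks like with the bcrypt npm package (a minimal sketch, not code from the article; the cost factor of 12 is just a common choice):

```typescript
import bcrypt from 'bcrypt';

const COST_FACTOR = 12; // higher = slower hashing = harder to brute-force

// Hash the password before storing it; bcrypt generates and embeds the salt.
async function hashPassword(password: string): Promise<string> {
  return bcrypt.hash(password, COST_FACTOR);
}

// On login, compare the plaintext attempt against the stored hash.
async function verifyPassword(attempt: string, storedHash: string): Promise<boolean> {
  return bcrypt.compare(attempt, storedHash);
}
```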

If you want to know more about Node.js, JavaScript, software architecture and much more -
subscribe to Tech Read channel.
Likes and shares of the post are welcome.

#nodejs #algorithms #hash
If you’re stressed about the future of Node.js - just read this article and relax.
As usual, it all comes down to standards (fortunately or not) and the community.

Don’t worry, Nobody is Replacing Node, not Even Bun and Even less Deno

For more info - subscribe to Tech Read channel.
Likes, shares and recommendations are welcome.

#nodejs #deno #bun
A new article from Yoni Goldberg that will help you look from a different angle at tools like Dotenv, the Morgan logger, environment variables, Nest.js DI, Passport.js, Supertest, etc.

Popular Node.js patterns and tools to re-consider

Highly recommend following the author on Medium.
And subscribe to my Tech Read channel where you’ll find a lot of interesting materials every week.
Likes, shares and recommendations are welcome.

#nodejs
In software development it is very important to keep error handling consistent.
You can check out a GitHub repository that describes an approach to handling errors in Node.js apps:

Handle errors in a simple, stable, consistent way
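
The core idea, roughly sketched (the names below are illustrative, not the repository’s exact API): a single error class carrying a name, an HTTP status and an "is operational" flag, plus one central handler that decides whether the process can keep running.

```typescript
export class AppError extends Error {
  constructor(
    name: string,
    message: string,
    public readonly httpStatus = 500,
    public readonly isOperational = true, // "trusted" error vs unknown failure
  ) {
    super(message);
    this.name = name;
  }
}

// One central place that every catch block / process event delegates to.
export function handleError(error: unknown): void {
  if (error instanceof AppError && error.isOperational) {
    console.error(error.name, error.message); // log, report metrics, keep serving
  } else {
    console.error('Unknown error, shutting down', error);
    process.exit(1); // unknown state: crash and let the process manager restart us
  }
}

// Usage: throw new AppError('user-not-found', 'No user with this id', 404);
```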

For more useful info - subscribe to Tech Read channel.
Likes, shares and recommendations are welcome.

#nodejs #errors
During application development you usually need to run some tasks at specific times or at certain intervals. This means that sooner or later you’ll need to choose a scheduler.

The main criteria here are: support for persistence, prevention of duplicate job execution, and scaling.

In the blog post What is a Job Scheduler you’ll find the most common Node.js modules for this - Agenda, Bull, Bree, Node-Cron.

Among them I recommend Bull (you probably already use Redis in your app and don’t want to install an additional database) or using a separate (micro)service.
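
A minimal sketch of a repeatable job with Bull (it assumes a Redis instance on localhost; the queue name and cron expression are just examples):

```typescript
import Queue from 'bull';

// Jobs are stored in Redis, so they survive restarts and are not
// duplicated when several workers share the same queue.
const reportQueue = new Queue('daily-report', 'redis://127.0.0.1:6379');

// Worker: processes jobs from the queue.
reportQueue.process(async (job) => {
  console.log('Generating report, job id:', job.id);
});

// Producer: schedule the job to repeat every day at 03:00.
reportQueue.add({}, { repeat: { cron: '0 3 * * *' } });
```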

To get more interesting information - subscribe to Tech Read channel.
Likes, shares and recommendations are welcome.

#nodejs #scheduler
Videos and animations are becoming an increasingly important part of web development, so the skill of rendering and automatically generating this kind of content is valuable for Node.js developers.

A good tutorial to start with is Video Rendering with Node.js and FFmpeg.
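
The basic idea looks roughly like this (a sketch, not code from the tutorial; it assumes ffmpeg is installed and the numbered frame files already exist):

```typescript
import { spawn } from 'node:child_process';

// Turn a folder of numbered PNG frames into an H.264 mp4.
const ffmpeg = spawn('ffmpeg', [
  '-y',                          // overwrite the output file if it exists
  '-framerate', '30',
  '-i', 'frames/frame-%04d.png', // frame-0001.png, frame-0002.png, ...
  '-c:v', 'libx264',
  '-pix_fmt', 'yuv420p',
  'out.mp4',
]);

ffmpeg.stderr.on('data', (chunk) => process.stdout.write(chunk)); // ffmpeg logs to stderr
ffmpeg.on('close', (code) => console.log(`ffmpeg exited with code ${code}`));
```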

For more articles and tutorials subscribe to Tech Read channel.
Likes, shares and recommendations are welcome.

#nodejs #ffmpeg #video
Blue-green deployment is a release management technique that reduces risk and minimizes downtime. It uses two production environments, known as Blue and Green, to provide reliable testing, continuous no-outage upgrades, and instant rollbacks.

If for some reason you don’t use Kubernetes, you can still implement this deployment strategy for a Node.js application with Nginx and pm2.

How? You’ll find instructions in the article Blue Green Deployment for Node.js Without Kubernetes.

And for more interesting articles - subscribe to Tech Read channel.
Likes, shares and recommendations are welcome.

#nodejs #nginx #pm2 #deployment
If you are a JavaScript developer - check your code again.
Are you sure that you handled all the “unhandled rejections”?

If not - check Jake Archibald’s post The gotcha of unhandled promise rejections.

Spoiler: sometimes you need to write something like “preventUnhandledRejections”, especially if you use old versions of Node.js (< 12.9.0) - and please update.
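
A quick sketch of the gotcha and the workaround (the URLs are illustrative, and fetch assumes Node.js 18+ or a browser):

```typescript
// Both requests start in parallel, but if chaptersPromise rejects while we
// are still awaiting bookPromise, the rejection has no handler attached yet -
// Node reports it as unhandled (and may even crash the process).
async function load() {
  const bookPromise = fetch('https://example.com/book');
  const chaptersPromise = fetch('https://example.com/chapters');

  const book = await bookPromise;
  const chapters = await chaptersPromise; // handler attached "too late"
  return { book, chapters };
}

// Workaround: attach a no-op catch so the rejection counts as handled,
// then await the real promise as usual (the error still surfaces there).
function preventUnhandledRejection<T>(promise: Promise<T>): Promise<T> {
  promise.catch(() => {});
  return promise;
}
```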

Not to miss something useful - subscribe to Tech Read channel.
Likes, shares and recommendations are welcome.

#javascript #nodejs #promises
Just a list of Enterprise Integration Patterns, each with a short description, schema, and details.

It will be useful and interesting for architects and engineers.

I hope the Tech Read channel is just as useful and interesting - subscribe.
Likes, shares and recommendations are welcome.

#patterns
This post is a selection of useful npm modules:

1. OTPAuth - One Time Password (HOTP/TOTP) library for Node.js, Deno, Bun and browsers.

HOTP (HMAC-based One-Time Password) and TOTP (Time-based One-Time Password) are two commonly used algorithms for generating one-time passwords (OTP) in two-factor authentication systems.
HOTP generates OTPs based on a counter value and a secret key using a hash-based message authentication code (HMAC) algorithm. Each time a new OTP is generated, the counter value is incremented.
TOTP, on the other hand, generates OTPs based on a combination of a secret key and the current time. A timestamp is used as the counter value, and the OTP changes every 30 seconds (default value).
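
To make the TOTP part concrete, here is a bare-bones implementation of the algorithm on top of node:crypto (just a sketch; in a real app you’d use OTPAuth, which also handles base32 secrets, otpauth:// URIs, validation windows, etc.):

```typescript
import { createHmac } from 'node:crypto';

// 6-digit TOTP: HMAC-SHA1 of the time-based counter, then dynamic truncation.
function totp(secret: Buffer, periodSeconds = 30, digits = 6): string {
  const counter = Math.floor(Date.now() / 1000 / periodSeconds);
  const counterBuf = Buffer.alloc(8);
  counterBuf.writeBigUInt64BE(BigInt(counter));

  const hmac = createHmac('sha1', secret).update(counterBuf).digest();
  const offset = hmac[hmac.length - 1] & 0x0f; // dynamic truncation (RFC 4226)
  const code = (hmac.readUInt32BE(offset) & 0x7fffffff) % 10 ** digits;
  return code.toString().padStart(digits, '0');
}

console.log(totp(Buffer.from('12345678901234567890'))); // changes every 30 seconds
```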

2. Concurrent.js - Non-blocking Computation for JavaScript RTEs (Web Browsers, Node.js & Deno).

Non-blocking computation is a technique used to allow JavaScript runtime environments (RTEs) to perform computationally intensive tasks without blocking the main thread of execution. This is achieved by executing these tasks asynchronously, using features such as web workers or worker threads.
In JavaScript, blocking the main thread can lead to performance issues and a poor user experience, as the user interface may become unresponsive while the script is running. Non-blocking computation helps to mitigate this issue by allowing these tasks to be performed in the background, without affecting the responsiveness of the user interface.
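
In Node.js the underlying mechanism is the built-in worker_threads module; here is a tiny self-contained sketch (it assumes a runtime that can execute this file directly, e.g. compiled JS or ts-node/tsx):

```typescript
import { Worker, isMainThread, parentPort, workerData } from 'node:worker_threads';

const fib = (n: number): number => (n < 2 ? n : fib(n - 1) + fib(n - 2));

if (isMainThread) {
  // Offload the CPU-heavy computation so the event loop stays responsive.
  const worker = new Worker(new URL(import.meta.url), { workerData: 40 });
  worker.on('message', (result) => console.log('fib(40) =', result));
  console.log('main thread is free while the worker computes...');
} else {
  // This same file is re-run as the worker.
  parentPort?.postMessage(fib(workerData as number));
}
```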

3. Malibu - Framework-agnostic CSRF middleware for modern Node.js

CSRF (Cross-Site Request Forgery) is a type of web attack where an attacker tricks a user into performing an unintended action on a web application. The attack typically involves the attacker crafting a request to the application, and then tricking the user into submitting that request through some form of social engineering, such as by clicking on a malicious link or visiting a page with a hidden form.
One common example of a CSRF attack is when an attacker creates a malicious form on a website, which is designed to submit a request to a different website that the user is already logged in to. If the user is tricked into submitting the form, the attacker can execute a malicious action on the targeted website on behalf of the user.
To prevent CSRF attacks, developers can implement security measures such as using anti-CSRF tokens or implementing same-site cookies. These measures can help to ensure that a request is only processed if it originates from an authorized source, and can help to prevent unauthorized actions on the targeted website.
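
To illustrate the token idea, here is a generic sketch of the synchronizer-token pattern built on node:crypto (not Malibu’s actual API):

```typescript
import { randomBytes, timingSafeEqual } from 'node:crypto';

const issuedTokens = new Map<string, Buffer>(); // sessionId -> CSRF token

// Issue a token together with the session and embed it in rendered forms.
export function issueCsrfToken(sessionId: string): string {
  const token = randomBytes(32);
  issuedTokens.set(sessionId, token);
  return token.toString('hex');
}

// On every state-changing request, verify the submitted token in constant time.
export function verifyCsrfToken(sessionId: string, received: string): boolean {
  const expected = issuedTokens.get(sessionId);
  if (!expected) return false;
  const receivedBuf = Buffer.from(received, 'hex');
  return receivedBuf.length === expected.length && timingSafeEqual(receivedBuf, expected);
}
```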

For more useful info - subscribe to Tech Read channel.
Likes, shares and recommendations are welcome.

#nodejs #deno #bun #npm #csrf
“JS (Browser + Node.js): Broadcast Channel API - How to Reduce Server Load”

In this article I describe a use case for combining the Broadcast Channel API and Shared Workers (or Worker Threads in Node.js) to reduce the number of server connections.

Broadcast Channel API is a JavaScript interface that allows communication between different browsing contexts, such as tabs or frames (or between Node.js Worker Threads, but about this at the end of the article), that share the same origin. It provides a simple publish-subscribe model for sending and receiving messages between these contexts.

A Shared Worker is a special type of web worker that can be accessed by multiple instances of an application. Shared workers have a single thread of execution and a scope shared between all clients, allowing for efficient communication and synchronization between tabs.
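
The API itself is tiny; the same shape works in the browser and in Node.js (where BroadcastChannel is exported from node:worker_threads and is also available as a global in recent versions):

```typescript
// Every context (browser tab, shared worker, Node.js worker thread) that
// opens a channel with the same name receives messages posted to it.
const channel = new BroadcastChannel('server-updates');

channel.onmessage = (event) => {
  console.log('received:', event.data);
};

// The one context that owns the single server connection re-broadcasts
// updates to all the others instead of each keeping its own connection.
channel.postMessage({ type: 'notification', unread: 3 });
```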

To learn how to combine them - check the article.

Medium link

If you liked the article and want to support the author:
Clap and follow me on Medium
Follow me on Linkedin
Subscribe to Tech Read channel

#javascript #nodejs #broadcastchannelapi #webapi #sharedworker
When Node.js is not enough

As a developer you should understand the importance of choosing the right programming language for a particular task. Node.js is a popular platform for building server-side applications using JavaScript, but sometimes we need to use other languages to take advantage of their unique features and capabilities.

Thankfully, Node.js provides several ways to integrate other programming languages into our applications. One popular approach is to use the Child Process module to spawn a child process and communicate with it through standard input/output streams. This allows us to execute external commands written in other languages and receive their output back in our Node.js application.
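
For example (a sketch; python3 on the PATH and the analyze.py script are assumptions for illustration):

```typescript
import { spawn } from 'node:child_process';

// Run a Python script and collect its stdout.
const child = spawn('python3', ['analyze.py', '--input', 'data.csv']);

let output = '';
child.stdout.on('data', (chunk) => (output += chunk));
child.stderr.on('data', (chunk) => console.error(chunk.toString()));

child.on('close', (code) => {
  if (code === 0) console.log('python produced:', output.trim());
  else console.error(`analyze.py exited with code ${code}`);
});
```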

Another option is to use a native Node.js add-on written in another language, such as C++ or Rust, to access low-level system resources or improve performance-critical parts of our code. Node.js provides a well-documented API for building native add-ons, and there are also several community-driven tools like neon and node-ffi that simplify the process of writing and using native add-ons.

Lastly, we can also use WebAssembly, a low-level binary format designed to run in web browsers, to execute code written in other languages like C, Rust, or Go. Node.js supports WebAssembly, which allows us to load and execute WebAssembly modules directly in our Node.js applications.
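
A minimal sketch (add.wasm is a hypothetical module, e.g. compiled from C or Rust, that exports an add(a, b) function):

```typescript
import { readFile } from 'node:fs/promises';

// Load and instantiate a WebAssembly module, then call one of its exports.
const wasmBuffer = await readFile('add.wasm');
const { instance } = await WebAssembly.instantiate(wasmBuffer, {});

const add = instance.exports.add as (a: number, b: number) => number;
console.log(add(2, 3)); // 5
```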

Using other programming languages in Node.js can open up new possibilities and help us build more robust, performant, and scalable applications. However, it's important to choose the right approach based on our specific needs and requirements, and to carefully consider the trade-offs involved in integrating different programming languages.

To know more - subscribe to the Tech Read channel.
I’ll also add a few links to articles about using Rust, Python, and C with Node.js.
Likes, shares and recommendations are welcome.

#nodejs #webassembly

Links:
https://johns.codes/blog/exposing-a-rust-library-to-node-with-napirs
https://www.alxolr.com/articles/how-to-process-a-csv-file-five-times-faster-in-node-js-with-rust-and-napi-rs
https://github.com/bitair-org/linker.js
Node.js 20 performance

As a software engineer, it is crucial to recognize the significance of periodically checking the performance and versions of Node.js applications. Node.js has gained immense popularity due to its scalability and high-performance capabilities, making it a go-to choice for developing server-side applications. However, to ensure the continued success of your Node.js applications, it is essential to stay on top of their performance and keep up with the latest versions.

***
Before you continue reading - subscribe to the Tech Read channel in Telegram.
Likes, shares and recommendations are welcome.
***

Here are a few reasons why periodic performance checks and version updates are of utmost importance:

- Optimizing Performance: By regularly monitoring the performance of your Node.js applications, you can identify bottlenecks, inefficiencies, or areas that require optimization. Performance monitoring tools, such as profiling and benchmarking frameworks, can help you pinpoint specific areas that need improvement. This proactive approach allows you to fine-tune your application, enhancing its speed, responsiveness, and overall user experience (see the small perf_hooks sketch after this list).

- Security and Bug Fixes: New vulnerabilities and bugs are discovered regularly, and the Node.js community actively addresses them by releasing patches and updates. Keeping your application up to date with the latest Node.js version ensures that you have the most robust security measures and bug fixes in place. Neglecting updates may expose your application to potential security breaches or software glitches that can impact its stability and reliability.

- Compatibility with Dependencies: Node.js applications often rely on various external dependencies, such as libraries, frameworks, or plugins. These dependencies also receive updates over time, introducing new features, bug fixes, or improved performance. By periodically checking the compatibility of your Node.js application with its dependencies, you can avoid conflicts, ensure smooth integration, and take advantage of the latest enhancements available.

- Community Support and Knowledge Sharing: Node.js benefits from a vast and active community of developers who constantly contribute to its growth and improvement. By staying updated with the latest versions and actively participating in the community, you gain access to valuable resources, best practices, and collaborative discussions. This engagement can help you overcome challenges more efficiently, discover new techniques, and remain at the forefront of Node.js development.
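
For the performance point above, even the built-in perf_hooks module is enough for a first measurement (a minimal sketch; the measured operation is just a placeholder):

```typescript
import { performance, PerformanceObserver } from 'node:perf_hooks';

// Print every completed measurement.
const observer = new PerformanceObserver((items) => {
  for (const entry of items.getEntries()) {
    console.log(`${entry.name}: ${entry.duration.toFixed(2)} ms`);
  }
});
observer.observe({ entryTypes: ['measure'] });

// Measure a suspect operation before and after a Node.js upgrade.
performance.mark('start');
JSON.parse(JSON.stringify({ data: Array.from({ length: 100_000 }, (_, i) => i) }));
performance.mark('end');
performance.measure('serialize+parse', 'start', 'end');
```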

To conclude, as a responsible software engineer, it is crucial to perform periodic checks on the performance and versions of your Node.js applications. By doing so, you can optimize performance, ensure security and bug fixes, maintain compatibility with dependencies, and stay engaged with the vibrant Node.js community. Embracing these practices will not only help you deliver high-quality applications but also provide a solid foundation for future scalability and success.

PS: The link to the article “State of Node.js Performance 2023” (which covers modules such as fs, events, http, misc, module, streams, url, buffers and utils) is below.

#nodejs #performance

Links:
https://blog.rafaelgss.dev/state-of-nodejs-performance-2023
Node.js: URL parsing

Ada URL Parser v2.0, the latest version of the powerful URL parsing tool, has been released, following closely after v1.0.4. This update introduces notable enhancements such as improved performance, reduced memory usage, and new features. The blog post below explores the advancements of Ada URL Parser v2.0, highlighting the benefits it brings to developers in their daily tasks.

***
Before you continue reading - subscribe to the Tech Read channel in Telegram.
Likes, shares and recommendations are welcome.
***

Improved Performance and Memory Usage:
One of the standout improvements in Ada URL Parser v2.0 is its enhanced performance. In some cases, the execution speed has doubled, allowing for faster parsing of URLs. This boost in performance can greatly benefit applications that heavily rely on URL parsing operations. Additionally, the update includes optimizations that result in reduced memory usage and allocations. Developers can now handle URL parsing tasks more efficiently, enabling better resource management within their applications.

Introducing a New Feature:
Ada URL Parser v2.0 also introduces a compelling new feature that will be particularly valuable for developers working with one-time URL parsing tasks. This feature enhances the tool's versatility and empowers developers to efficiently handle specific URL parsing requirements. Whether it's extracting specific parameters or performing advanced parsing operations, this new addition expands the capabilities of Ada URL Parser and simplifies URL manipulation tasks.
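
In a Node.js application you mostly meet Ada indirectly, through the built-in WHATWG URL API (Ada has been Node’s URL parser since roughly Node.js 18.16 / 20). A quick sketch:

```typescript
// Parsing and inspecting a URL with the WHATWG URL API.
const url = new URL('https://example.com/products?id=42&ref=newsletter');
console.log(url.hostname);               // "example.com"
console.log(url.searchParams.get('id')); // "42"

// For one-off "is this even a URL?" checks, newer Node.js versions expose
// URL.canParse(), which avoids constructing (and throwing on) a full URL object.
if (!URL.canParse('not a url')) {
  console.log('invalid URL');
}
```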

Benefitting Developers in their Everyday Work:
The improvements in Ada URL Parser v2.0 provide developers with a more efficient and reliable tool for working with URLs. With enhanced performance and reduced memory usage, developers can expect faster and more optimized URL parsing operations. The new feature adds versatility and flexibility to address specific parsing needs, further streamlining the development process.

Conclusion:
Ada URL Parser v2.0 brings significant enhancements, including improved performance, reduced memory usage, and a new feature to tackle one-time URL parsing tasks. These updates make Ada URL Parser a powerful and reliable tool for developers working with URLs. By boosting performance and introducing new capabilities, Ada URL Parser v2.0 empowers developers to handle a broader range of URL parsing tasks with ease. Upgrade to the latest version and experience the enhanced capabilities of Ada URL Parser in your everyday work.

PS: The link to the article “Reducing the cost of string serialization in Node.js core” is below.

#nodejs #url #serialization

Links:
https://www.yagiz.co/reducing-the-cost-of-string-serialization-in-nodejs-core