Node.js: read a JSON file
When you want to store data between server restarts with Node, JSON files are a simple and convenient choice. Whether you are reading a config file or persisting data for your application, Node has built-in utilities that make it easy to read and write JSON files. We will look at a few different methods for working with JSON files.
In this tutorial we'll:
- Read JSON data from disk
- Learn to use the fs module to interact with the filesystem
- Persist data to a JSON file
- Use JSON.parse and JSON.stringify to convert data to and from JSON format
By the end of this tutorial you should be able to work with JSON files using Node’s built-in fs module.
Say you have a customer.json file saved to disk that holds a record for a customer in your store.
As part of your store app, you want to access the customer’s address, and then update the order count after an order is placed.
In this tutorial, we are going to look at how to read and write to our customer.json file.
Serializing and deserializing JSON
Serialization is the process of converting an object or data structure into a format that is easy to store or transfer over a network. You can recover the serialized data by applying the reverse process.
Deserialization refers to transforming the serialized data structure to its original format.
You will almost always need to serialize a JavaScript object to a JSON string in Node. You can do so with the JSON.stringify method before writing it to a storage device or transmitting it over the internet:
On the other hand, after reading the JSON file, you will need to deserialize the JSON string to a plain JavaScript object using the JSON.parse method before accessing or manipulating the data:
JSON.stringify and JSON.parse are globally available methods in Node. You don't need to install or require anything before using them.
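A quick sketch of the round trip; the customer record here is invented for illustration:

```javascript
// Serialize: JavaScript object -> JSON string
const customer = { name: 'Mary Jane', orderCount: 5 }; // hypothetical record
const jsonString = JSON.stringify(customer);
console.log(jsonString); // '{"name":"Mary Jane","orderCount":5}'

// Deserialize: JSON string -> JavaScript object
const parsed = JSON.parse(jsonString);
console.log(parsed.orderCount); // 5
```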
A note about file encoding
Both fs.readFileSync and fs.readFile take an optional encoding argument. If you specify a character encoding, you'll get a string in return. If you do not specify a character encoding, both functions will return a Buffer.
This is because Node does not, and cannot, assume what kind of content a file contains, even if you can. To handle this lack of definition, Node reads the file byte by byte and returns it as an unopinionated Buffer that you can process as desired.
If you do know the content of the file, providing that detail to Node in the form of an encoding argument generally makes the code both more performant and easier to understand.
Read a JSON file
The simplest way to read a JSON file is to require it. Passing require() with the path to a JSON file will synchronously read and parse the data into a JavaScript object.
But reading JSON files with require has its downsides. The file will only be read once; requiring it again returns the cached data from the first time require was run. This is fine for loading static data on startup (like config data). But for reading a file that changes on disk, like our customer.json might, we need to manually read the file using the asynchronous fs.readFile .
Read a file with fs.readFile
To access the customer’s address, we need to:
- Read the JSON data from the file
- Parse the JSON string into a JavaScript object
To load the data from the customer.json file, we will use fs.readFile, passing it the path to our file, an optional encoding type, and a callback to receive the file data.
If the file is successfully read, the contents will be passed to the callback.
- ./customer.json is the relative path to the file
- utf8 is an optional parameter specifying the encoding of the file we are reading; it can be left out. If not specified, the function will return a Buffer instead of a string
- (err, jsonString) => {...} is the callback function that runs after the file has been read
Now we have the contents of the file as a JSON string, but we need to turn the string into an object.
Before we can use the data from the callback in our code, we must turn it into an object. JSON.parse takes JSON data as input and returns a new JavaScript object. Otherwise, we would just have a string of data with properties we can’t access.
JSON.parse can throw an exception and crash our program if passed an invalid JSON string. To prevent crashes, we wrap JSON.parse in a try...catch statement to gracefully catch any errors.
This example shows reading and parsing a JSON file:
Using the jsonString from reading customer.json, we create an object, and can access the address property. If JSON.parse throws an error, we handle it in the catch block.
Now we have an object representation of the data in our customer.json file!
We can also read the file synchronously using fs.readFileSync . Instead of taking a callback, readFileSync returns the file content after reading the file.
We can use this knowledge to create a reusable helper function to read and parse a JSON file. Here we create a function called jsonReader that will read and parse a JSON file for us. It takes the path to the file and a callback to receive the parsed object and any errors. It will catch any errors thrown by JSON.parse for us.
How to use the bfj npm package for reading and writing JSON files
bfj is another npm package you can use for handling data in JSON format. According to the documentation, it was created for managing large JSON datasets.
bfj implements asynchronous functions and uses pre-allocated fixed-length arrays to try and alleviate issues associated with parsing and stringifying large JSON or JavaScript datasets – bfj documentation
You can read JSON data using the read method. The read method is asynchronous and it returns a promise.
Assuming you have a config.json file, you can use the following code to read it:
Similarly, you can use the write method to write data to a JSON file:
bfj has many more functions that you can read about in the documentation. It was created purposely for handling large JSON data, but it is also slow, so you should use it only if you are handling relatively large JSON datasets.
Write to a file with fs.writeFile
Writing JSON to the filesystem is similar to reading it. We will use fs.writeFile to asynchronously write data to a newCustomer.json file.
First, to write data to a JSON file, we must create a JSON string of the data with JSON.stringify . This returns a JSON string representation of a JavaScript object, which can be written to a file. Similar to parsing data into an object when reading a file, we must turn our data into a string to be able to write it to a file.
Create a customer object with our data below, and turn it into a string.
Note: If you try to write an object to a file without stringifying it, your file will be empty and look like this:
Once the data is stringified, we can use fs.writeFile to create a new customer file. We pass fs.writeFile the file path, our customer data to write, and a callback that will be executed after the file is written. If the newCustomer.json file doesn't already exist, it will be created; if it does exist, it will be overwritten!
Promise-based API
The promise-based API is asynchronous, like the callback API. It returns a promise, which you can manage via promise chaining or async-await.
You can access the promise-based API by requiring fs/promises :
We used the CommonJS syntax for accessing modules in the code snippets above, and we'll use CommonJS throughout this article because Node treats JavaScript code as a CommonJS module by default. You can also use ES6 modules if you want.
According to the Node documentation, the callback API of the built-in fs module is more performant than the promise-based API. Therefore, most examples in this article will use the callback API.
Callback API
Unlike the synchronous methods that block the event loop, the corresponding methods of the callback API are asynchronous. You’ll pass a callback function to the method as the last argument.
The callback function is invoked with an Error object as the first argument if an error occurs. The remainder of the arguments to the callback function depends upon the fs method.
You can also access the methods of the callback API by requiring fs like the synchronous API:
Reading a JSON file
Let's first look at how we can read a file that already exists. But before we can do that, we need to create the file. Open a new window in your favorite text editor and add the following text to it:
Now save this file as student.json in your project directory.
To read the JSON data from the file, we can use the Node.js fs module. Two functions are available in this module for reading files from the filesystem: readFile and readFileSync.
While both functions perform a similar task, reading files from disk, the difference lies in how they are actually executed, which we'll explain in more detail in the following sections.
Using fs.readFileSync
The readFileSync function reads data from a file synchronously. It blocks the execution of the rest of the code until all the data has been read from the file. This function is especially useful when your application needs to load configuration settings before it can perform any other tasks.
To continue our example, let's use the readFileSync function to read the student.json file we created earlier. Add the following code to a .js file:
In the Node.js code above, we first load the fs module into our application. Then we use readFileSync, passing it the relative path to the file we want to read. If you print the rawdata object to the console, you will see the raw data (a Buffer) on the console screen:
However, we want to read the file in its JSON form, not as raw bytes. This is where the JSON.parse function comes in. It handles parsing the raw data into text and converting the actual JSON data into a JavaScript object. Now, if you print student to the console, you will get the following output:
As you can see, the JSON from our file has been successfully loaded into the student object.
Using fs.readFile
Another way to read a JSON file in Node.js is to use the readFile function. Unlike readFileSync, readFile reads file data asynchronously. When readFile is called, the file reading process starts and control immediately passes to the next line, executing the remaining lines of code. Once the file data has been loaded, readFile invokes the callback function provided to it. This way you don't block code execution while waiting for the operating system to get back to you with the data.
In our example, readFile takes two parameters: the path of the file to be read and a callback function to invoke when the file has been read completely. Optionally, you can also include an options parameter, but we won't cover it in this article.
Take a look at the following example to understand how to use the readFile function:
The code above does the same thing as our previous snippet (with an extra console.log), but does it asynchronously. Here are a few differences you may have noticed:
- (err, data) => {...}: this is our callback function, which is executed after the file has been completely read.
- err: since we can't easily use try/catch with asynchronous code, the function instead gives us an err object if something goes wrong; it is null if there were no errors.
You may also have noticed that we log a string to the console immediately after calling readFile. This is to demonstrate the behavior of asynchronous code. When the script above is executed, you will see that this console.log runs before the readFile callback is executed. That's because readFile doesn't block code execution while reading data from the filesystem.
The output of the code will look like this:
As you can see, the last line of code in our file is actually the one that appears first in the output.
Using require
Another approach is to use the global require function to read and parse JSON files. This is the same method you use to load Node modules, but it can also be used to load JSON.
Take a look at the following example:
It works exactly like the readFileSync code we showed above, but require is a globally available method that you can use anywhere, which has its advantages.
However, the require function has a few drawbacks:
- require is a synchronous function and is only called once, which means subsequent calls receive a cached result. If the file is updated, you can't re-read it using this method.
- Your file must have a .json extension, so it is less flexible. Without the correct extension, require won't treat the file as a JSON file.
Synchronous version
If the JSON content is streamed over the network, you need to use a streaming JSON parser. Otherwise it will tie up your processor and choke your event loop until JSON content is fully streamed.
There are plenty of packages available in NPM for this. Choose what's best for you.
If you are unsure whether what is passed to JSON.parse() is valid JSON, make sure to enclose the call to JSON.parse() inside a try/catch block. A user-provided JSON string could crash your application, and could even lead to security holes. Make sure error handling is done if you parse externally provided JSON.
Another example of JSON.parse :
I'd like to mention that there are alternatives to the global JSON object. JSON.parse and JSON.stringify are both synchronous, so if you want to deal with big objects you might want to check out some of the asynchronous JSON modules.
This is especially true if you expect JSON data from incoming connections. If malformed JSON is parsed by JSON.parse, your whole application is going to crash or, using process.on('uncaughtException', function(err) {...}), there will eventually be no chance to send a "malformed JSON" error to the user.
Include the node-fs library.
It might be worth noting that you should wrap your var file line in a try/catch just in case your JSON fails to parse or the file does not exist.
Since you don't know whether your string is actually valid, I would put it into a try/catch first. Also, since try/catch blocks are not optimized by Node, I would put the entire thing into another function:
OR in "async style"
I just want to note that process.nextTick is not async. It just puts off reading the file until the next tick of the JS event loop. To run JSON.parse asynchronously you have to use a different thread than the main Node.js thread.
Parsing a JSON stream? Use JSONStream .
Everybody here has talked about JSON.parse, so I thought I'd say something else. There is a great module, Connect, with many middleware to make development of apps easier and better. One of those middleware is bodyParser, which parses JSON, HTML forms, and so on. There is also a specific middleware for JSON parsing only.
Take a look at the links above, it might be really helpful to you.
as other answers here have mentioned, you probably want to either require a local json file that you know is safe and present, like a configuration file:
or to use the global JSON object to parse a string value into an object:
note that when you require a file the content of that file is evaluated, which introduces a security risk in case it's not a json file but a js file.
you can modify the code and see the impact.
Using JSON for your configuration with Node.js? Read this and get your configuration skills over 9000.
Note: People claiming that data = require('./data.json'); is a security risk and downvoting people's answers with zealous zeal: You're exactly and completely wrong. Try placing non-JSON in that file. Node will give you an error, exactly like it would if you did the same thing with the much slower and harder to code manual file read and then subsequent JSON.parse(). Please stop spreading misinformation; you're hurting the world, not helping. Node was designed to allow this; it is not a security risk!
Proper applications come in 3+ layers of configuration:
- Server/Container config
- Application config
- (optional) Tenant/Community/Organization config
- User config
Most developers treat their server and app config as if it can change. It can't. You can layer changes from higher layers on top of each other, but you're modifying base requirements. Some things need to exist! Make your config act like it's immutable, because some of it basically is, just like your source code.
Failing to see that lots of your stuff isn't going to change after startup leads to anti-patterns like littering your config loading with try/catch blocks, and pretending you can continue without a properly set up application. You can't. If you can, that belongs in the community/user config layer, not the server/app config layer. You're just doing it wrong. The optional stuff should be layered on top when the application finishes its bootstrap.
Stop banging your head against the wall: Your config should be ultra simple.
Take a look at how easy it is to setup something as complex as a protocol-agnostic and datasource-agnostic service framework using a simple json config file and simple app.js file.
container-config.js.
index.js. (the engine that powers everything)
app.js. (the code that powers your protocol-agnostic and data-source agnostic service)
Using this pattern, you can now load community and user config on top of your booted app, dev ops is ready to shove your work into a container and scale it. You're ready for multi-tenant. Userland is isolated. You can now separate the concerns of which service protocol you're using and which database type you're using, and just focus on writing good code.
Because you're using layers, you can rely on a single source of truth for everything, at any time (the layered config object), and avoid error checks at every step, worrying about "oh crap, how am I going to make this work without proper config?"
JSON (JavaScript Object Notation) is a popular format for sharing data among applications written in different languages. In Node.js applications, JSON has become a convenient choice for storing data thanks to its uniformity and simplicity.
Node.js provides some built-in modules that make it easy to work with JSON data. In this article, you'll learn to:
- Read JSON files from the disk
- Write JSON data to a file
- Use the fs module to interact with the filesystem
- Use built-in methods like JSON.parse() and JSON.stringify() to convert data from and to JSON format
- Use the global require() method to load a JSON file at startup
Before I go into details of reading a JSON file, let us first create a new JSON file called databases.json that holds the following JSON data:
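The exact records aren't important; a plausible databases.json might look like this:

```json
[
  { "name": "MySQL", "type": "RDBMS" },
  { "name": "MongoDB", "type": "NoSQL" },
  { "name": "SQLite", "type": "RDBMS" }
]
```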
The databases.json is a simple file stored on disk that contains a JSON array of objects. Our goal is to read this file and print the records on the console.
To read the JSON data from the above file, you can use the native fs module. This module provides methods to read, write, and watch files along with many other functions to interact with the filesystem. Since it is a native module, you don't need to install anything. Just import it in your code by calling const fs = require('fs') .
The fs module gives us two methods, fs.readFile() and fs.readFileSync() , that can be used to read data from a file. Both these functions do the same thing — reading files from disk. The only difference lies in the way these functions are actually executed.
The fs.readFile() method reads data from a file asynchronously. It doesn't block the execution of the event loop while reading the file. Instead, the control is shifted to the next line to execute the remaining lines of code. Once the file data becomes available, fs.readFile() invokes the callback function passed to it as an argument.
To read the JSON data from the databases.json file by using the fs.readFile() method, just pass in the name of the file, an optional encoding type, and a callback function to receive the file data:
In the above example, since the fs.readFile() method returns data as a JSON string, we have to use JSON.parse() to parse it into a JavaScript object. Finally, we use a forEach() loop to print all the databases on the console.
Here is the output of the above code:
The fs.readFileSync() method reads data from a file in a synchronous manner. Unlike fs.readFile() , it blocks the execution of the event loop until all the data from the file is loaded.
Instead of passing the callback method, you only pass the name of the file to fs.readFileSync() as shown below:
Although fs.readFileSync() has a clean syntax, you should never use it to read large files, as it blocks the execution of the event loop and can drastically impact the performance of the application. It is useful only for reading configuration files on application start, before performing any other tasks.
Finally, the last way of reading a JSON file is by using the global require() method. This approach is similar to what you use for loading Node.js modules, but it also works for loading JSON files.
All you need to do is pass the JSON file path to the require() method, and it will synchronously read and parse the JSON file, and return a JSON object ready to be used:
The require() method works exactly like fs.readFileSync(), reading the file synchronously, but it is a global method that can be called from anywhere. Moreover, it automatically parses the file content into a JavaScript object.
However, there are a few downsides of using the require() method:
- It only reads the file once and caches the data; requiring it again will simply return the cached data.
- The file must have the .json extension. Without the proper extension, the require() method won't treat it as a JSON file.
Because of the above limitations, require() is only suitable for loading static configuration files that don't change often. For reading a dynamic file like databases.json , you should use the fs.readFile() method instead.
Just like the fs.readFile() and fs.readFileSync() methods, the fs module provides two more functions for writing data to files: fs.writeFile() and fs.writeFileSync().
As the names suggest, the fs.writeFileSync() method writes data to a file synchronously while fs.writeFile() writes data to a file in an asynchronous manner.
To write JSON to a file by using fs.writeFile(), just pass in the path of the file to write data to, the JSON string that you want to write, an optional encoding type, and a callback function that will be executed after the file is written.
Note that if the file doesn't already exist, it will be created; if it does exist, it will be overwritten!
Here is an example:
In the above example, we are storing the JSON object user to the user.json file.
Notice the use of the JSON.stringify() method to convert the JSON object into a JSON string before saving it to disk. If you try to write an object to a file without first stringifying it, your file will be empty and look like below:
Now if you execute the above code, you should see the following content in the user.json file:
Technically, that's all you need to write JSON to a file. However, the data is written as a single line of string in the file.
To pretty-print the JSON object, change the JSON.stringify() method as follows:
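One way to do this, using JSON.stringify's third argument (the user object is invented):

```javascript
const user = { name: 'John Doe', email: 'john.doe@example.com' };

// The third argument sets the indentation (2 spaces here),
// so the output spans multiple lines instead of a single one
const pretty = JSON.stringify(user, null, 2);
console.log(pretty);
```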
Now, if you open the user.json file, you should see the following content:
Finally, the last way to write data to a JSON file is by using the fs.writeFileSync() method. It writes data to a file synchronously which means it blocks the execution of the Node.js event loop until the file is written to disk.
Take a look at the following example that uses fs.writeFileSync() to write a JSON object to a file:
Now that we have learned how to read and write JSON files, what if you want to update an existing JSON file?
We can combine both these approaches to use our JSON files as a simple database. Whenever we want to update the data in the JSON file, we can read the contents, change the data, and then write the new data back to the original file.
Here is an example that demonstrates how you can add another record to the databases.json file:
Now, if you execute the above code, you should see a new entry in databases.json as shown below:
If you don't want to manually parse or stringify JSON data each time you read or write to a JSON file, use the jsonfile module instead.
The jsonfile module wraps the fs module and JSON object methods and exposes the same methods as the fs module for reading and writing JSON files.
Type the following command in your project root directory to install the jsonfile module:
To read data from JSON files, the jsonfile module provides readFile() and readFileSync() methods. They are similar to those offered by the fs module except that they automatically parse the contents of the file into a JSON object:
Similarly, to write data to a JSON file, you can either use the writeFile() or writeFileSync() method:
JSON is one of the most popular types of data that you can expect to work with in Node.js, and being able to read and write JSON files is extremely useful.
In this article, we have looked at different ways to read and write JSON files, including the fs module, the require() method, and the jsonfile module — a 3rd-party module.
The fs module is a native module that provides functions for both reading and writing files. The fs.readFile() and fs.writeFile() methods can be used to read and write data to JSON files asynchronously. To synchronously interact with the filesystem, there are fs.readFileSync() and fs.writeFileSync() methods available.
You can also use the global require() method to synchronously read and parse a JSON file at startup. However, it only caches the file data and can only be used to read files with the .json extension.
If you want to learn more, take a look at what JSON actually is, and how you can read and write a JSON object to a file in Node.js.
One of the best ways to exchange information between applications written in different languages is to use the JSON (JavaScript Object Notation) format. Thanks to its uniformity and simplicity, JSON has almost completely replaced XML as the standard data-exchange format in software, especially in web services.
Given the widespread use of JSON in software applications, and especially in JavaScript-based ones, it is important to know how to read and write JSON data to a file in Node.js. In this article, we explain how to do exactly that.
Each function exposed by the fs module has the synchronous, callback, and promise-based form. The synchronous and callback variants of a method are accessible from the synchronous and callback API. The promise-based variant of a function is accessible from the promise-based API.
How to read a JSON file using the fs.readFile method
You can use the readFile method to read JSON files. It asynchronously reads the contents of the entire file in memory, therefore it is not the most optimal method for reading large JSON files.
The first argument, path, is the file name or the file descriptor. The second is an optional options object, and the third is a callback function. You can also pass a string as the second argument instead of an object; in that case, the string specifies the encoding.
The callback function takes two arguments. The first argument is the error object if an error occurs, and the second is the serialized JSON data.
The code snippet below will read the JSON data in the config.json file and log it on the terminal:
Make sure to deserialize the JSON string passed to the callback function before you start working with the resulting JavaScript object.
How to write to JSON files using the fs.writeFileSync method
Unlike writeFile, writeFileSync writes to a file synchronously. If you use writeFileSync, you will block the execution of the event loop and the rest of the code until the operation succeeds or an error occurs. It will create a new file if the path you pass doesn't exist, and overwrite it if it does.
In the code snippet below, we are writing to the config.json file. We are wrapping the code in try-catch so that we can catch any errors:
Introduction to the fs module
Because the fs module is built in, you don’t need to install it. It provides functions that you can use to read and write data in JSON format, and much more.
Asynchronous version
How to use the jsonfile npm package for reading and writing JSON files
jsonfile is a popular npm package for reading and writing JSON files in Node. You can install it using the command below:
It is similar to the readFile and writeFile methods of the fs module, though jsonfile has some advantages over the built-in methods.
Some of the features of this package are as follows:
- It serializes and deserializes JSON out of the box
- It has a built-in utility for appending data to a JSON file
- Supports promise chaining
You can see the jsonfile package in action in the code snippet below:
You can also use promise chaining instead of passing a callback function like in the above example:
How to read a JSON file using fs.readFileSync method
readFileSync is another built-in method for reading files in Node similar to readFile . The difference between the two is that readFile reads the file asynchronously while readFileSync reads the file synchronously. Therefore, readFileSync blocks the event loop and execution of the remaining code until all the data has been read.
To grasp the difference between synchronous and asynchronous code, you can read the article “Understanding asynchronous JavaScript”.
Below is the function signature of fs.readFileSync :
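```
fs.readFileSync(path[, options])
```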
path is the path to the JSON file you want to read, and you can pass an object as the second argument. The second argument is optional.
In the code snippet below, we are reading JSON data from the config.json file using readFileSync :
Conclusion
As explained in the above sections, JSON is one of the most popular formats for data exchange over the internet.
The Node runtime environment has the built-in fs module you can use to work with files in general. The fs module has methods that you can use to read and write to JSON files using the callback API, promise-based API, or synchronous API.
Because methods of the callback API are more performant than those of the promise-based API, as highlighted in the documentation, you are better off using the callback API.
In addition to the built-in fs module, several popular third-party packages such as jsonfile , fs-extra , and bfj exist. They have additional utility functions that make working with JSON files a breeze. On the flip side, you should evaluate the limitations of adding third-party packages to your application.
How to use the fs-extra npm package for reading and writing JSON files
fs-extra is another popular Node package you can use to work with files. Though you can use this package for managing JSON files, its methods extend well beyond just reading and writing JSON files.
As its name suggests, fs-extra has all the functionalities provided by the fs module and more. According to the documentation, you can use the fs-extra package instead of the fs module.
You need to first install fs-extra from npm before using it:
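```shell
npm install fs-extra
```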
The code below shows how you can read JSON files using the readJson method of the fs-extra package. You can use a callback function, promise chaining, or async/await :
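A sketch of the promise and async/await forms, assuming the package is installed and a config.json file exists:

```javascript
const fse = require('fs-extra');

// Promise form: readJson parses the file into an object for you.
fse.readJson('config.json')
  .then((obj) => console.log(obj))
  .catch((err) => console.error(err));

// async/await form of the same read.
async function loadConfig() {
  try {
    const obj = await fse.readJson('config.json');
    console.log(obj);
  } catch (err) {
    console.error(err);
  }
}
loadConfig();
```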
The code below illustrates how you can write JSON data using the writeJson method:
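A sketch of writing with writeJson; the data is a hypothetical config object, and the spaces option pretty-prints the output:

```javascript
const fse = require('fs-extra');

// writeJson stringifies the object for you; `spaces: 2` pretty-prints it.
fse.writeJson('config.json', { port: 3000 }, { spaces: 2 })
  .then(() => console.log('Config saved'))
  .catch((err) => console.error(err));
```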
Just like the fs module, fs-extra has both asynchronous and synchronous methods. You don’t need to stringify your JavaScript object before writing to a JSON file.
Similarly, you don’t need to parse to a JavaScript object after reading a JSON file. The module does it for you out of the box.
Learn more
Want to learn more about the fundamentals of Node.js? Personally, I would recommend taking an online course, such as Learn Node.js by Wes Bos. Not only will you learn the most modern ES2017 syntax, but you will also get to build a full restaurant application. In my experience, building real applications like this is the fastest way to learn.
How to load a JSON file using the global require function
You can use the global require function to synchronously load JSON files in Node. After a file is loaded with require , it is cached, so loading the same file again returns the cached version. In a server environment, the file is only read again on the next server restart.
It is therefore advisable to use require only for loading static JSON files, such as configuration files, that do not change often. Do not use require if the JSON file changes at runtime: require will keep serving the cached version, so your latest changes will not be reflected.
Assuming you have a config.json file with the following content:
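For example, the file might contain something like this (the keys are placeholders; use your own settings):

```json
{
  "host": "localhost",
  "port": 3000
}
```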
You can load the config.json file in a JavaScript file using the code below. require will always load the JSON data as a JavaScript object:
Recap
JSON is one of the most common types of data you’ll work with in Node, and being able to read and write JSON files is very useful. You’ve learned how to use fs.readFile and fs.writeFile to asynchronously work with the filesystem, as well as how to parse data to and from JSON format, and catch errors from JSON.parse .
You can use require to synchronously read and parse a JSON file in one line at startup. And now you can use a simple JSON file as a data store.
If you want to learn more, you can read up on what JSON actually is, and find out more about synchronous vs asynchronous code.
Synchronous version
You can also load a JSON file with require . But I do not recommend this, for several reasons:
- require is synchronous. If you have a very big JSON file, it will choke your event loop. You really need to use JSON.parse with fs.readFile .
- require will read the file only once. Subsequent calls to require for the same file will return a cached copy. Not a good idea if you want to read a .json file that is continuously updated. You could use a hack (such as deleting the entry from require.cache ), but at this point it's easier to simply use fs .
- If your file does not have a .json extension, require will not treat the contents of the file as JSON.
Seriously! Use JSON.parse .
If you are reading a large number of .json files (and if you are extremely lazy), it becomes annoying to write the boilerplate code every time. You can save some characters by using the load-json-file module.
Asynchronous version
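A sketch of the asynchronous form, assuming the package is installed (note that recent versions of load-json-file are ESM-only; the require form below applies to older CommonJS releases):

```javascript
const loadJsonFile = require('load-json-file');

loadJsonFile('foo.json').then((json) => {
  // `json` is already a parsed JavaScript object.
  console.log(json);
});
```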
Interact with files with fs
Accessing files in Node is done with the native module fs , which gives you functions to watch, read, and write files along with many other tools to work with the filesystem. Because it’s a native module, we can require it in our code without installing it. Just call const fs = require('fs') .
The fs module gives us the option of synchronous or asynchronous versions of many of its functions. The synchronous versions block execution of other code until they are done accessing the filesystem, reading, or writing data. An async function will run without blocking other code. Learn more about sync/async behavior.
This synchronous behavior can be useful in some places, like at startup when reading a config file before any other code is run, but becomes a big issue when used in a webserver where all incoming requests would be blocked while a synchronous file read is running. For this reason, you generally want to use the async versions of fs functions in your code. We will focus on async operations, but will also show the synchronous equivalent.
To read and write files asynchronously with fs we will use fs.readFile and fs.writeFile .
We also will use the global JSON helper to convert objects to JSON strings, and JSON strings to objects.
How to read JSON files in Node.js
The Node runtime environment has the built-in require function and the fs module that you can use for loading or reading JSON files. Because require is globally available, you don’t need to require it.
However, you will need to require the fs module before using it. I will discuss how to read JSON files using the built-in fs module and require function in the following subsections.
How should I parse JSON using Node.js? Is there some module which will validate and parse JSON securely?
Conclusion
In this article we showed how you can read and write JSON data from and to files, which is a very common and important task to know as a web programmer.
fs has several methods for both reading and writing JSON files. The readFile and readFileSync functions will read JSON data from a file asynchronously and synchronously, respectively. You can also use the global require to handle reading and parsing JSON data from a file in a single line of code. However, require is synchronous and can only read JSON data from files with a .json extension.
Similarly, the writeFile and writeFileSync functions from fs write JSON data to a file asynchronously and synchronously, respectively.
JavaScript Object Notation, referred to as JSON in short, is one of the most popular formats for data storage and data interchange over the internet. The simplicity of the JSON syntax makes it very easy for humans and machines to read and write.
Despite its name, the use of the JSON data format is not limited to JavaScript. Most programming languages implement data structures that you can easily convert to JSON and vice versa.
JavaScript, and therefore the Node.js runtime environment, is no exception. More often than not, this JSON data needs to be read from or written to a file for persistence. The Node runtime environment has the built-in fs module specifically for working with files.
This article is a comprehensive guide on how to use the built-in fs module to read and write data in JSON format. We shall also look at some third party npm packages that simplify working with data in the JSON format.
Further your understanding
- Learn about using Node.js Streams to read really large files.
- Walk through this example with a co-worker. Are all the concepts clear to you? Do you need to review anything?
How to write to JSON files using the fs.writeFile method
JSON.stringify will put all of your JSON data on a single line if you do not pass the optional formatting arguments that specify how to indent it.
If the path you pass to the writeFile method is for an existing JSON file, the method will overwrite the data in the specified file. It will create a new file if the file does not exist:
Writing JSON to a file
Similar to readFile and readFileSync , there are two functions for writing data to files: writeFile and writeFileSync . As the names suggest, writeFile writes data to a file asynchronously, while writeFileSync writes data to a file synchronously.
We will look at each in more detail in the following sections.
Using fs.writeFileSync
The writeFileSync function takes 2-3 parameters: the path of the file to write data to, the data to write, and an optional options parameter.
Note that if the file doesn’t already exist, a new file is created for you. Take a look at the following example:
In the example above, we are storing our student JSON object in a file named "student-2.json". Note that we have to use JSON.stringify before saving the data. Just as we needed to parse the data when reading the JSON file, we need to "stringify" the data before we can store it in string form in the file.
Run the code above and open the student-2.json file. You should see the following content in the file:
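With a small record like { id: 1, name: 'Mike', age: 23 }, the file holds everything on one line:

```json
{"id":1,"name":"Mike","age":23}
```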
While this is the data we wanted to write, it is on a single line, which is hard for us to read. If you want the serialized JSON to be human-readable, change the JSON.stringify call as follows:
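A sketch of the pretty-printed form, again using a hypothetical student record:

```javascript
// null for the replacer argument, 4 for the number of spaces to indent by.
const json = JSON.stringify({ id: 1, name: 'Mike', age: 23 }, null, 4);
console.log(json);
```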
Here we are telling the method to add newlines and indentation to the serialized JSON. Now, if you open the student-2.json file, you should see the text in the following format:
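With the hypothetical record above and an indent of 4 spaces, the file looks like this:

```json
{
    "id": 1,
    "name": "Mike",
    "age": 23
}
```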
Using fs.writeFile
As mentioned earlier, the writeFile function runs asynchronously, which means our code is not blocked while the data is written to the file. And, as with the asynchronous methods we saw earlier, we need to pass a callback to this function.
Let's write another file, student-3.json, with the writeFile function:
The output of the above script will be:
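Assuming a script that calls fs.writeFile and then logs a line right after, the later line appears first in the console:

```
This line runs before the file is written
Data written to file
```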
Once again, you can see that the last line of our code actually appears first in the console, since our callback has not yet been called. This leads to significant savings in execution time if you have large amounts of data to write to a file, or if you have quite a few files to write.
How to append to a JSON file
Node doesn’t have a built-in function for appending to or updating fields of an existing JSON file. Instead, you can read the JSON file using the readFile method of the fs module, update the data, and overwrite the file with the updated JSON.
Below is a code snippet illustrating how to go about it:
Synchronous API
The synchronous methods of the built-in fs module block the event loop and further execution of the remaining code until the operation has succeeded or failed. More often than not, blocking the event loop is not something you want to do.
The names of all synchronous functions end with the Sync characters. For example, writeFileSync and readFileSync are both synchronous functions.
You can access the synchronous API by requiring fs :
Update a JSON file
Now that we are able to read and write our customer files, we can use them as a simple kind of database. If we want to update the data in the JSON file, we can read the contents, change the data, and then write the new data back to the file:
Definitely not the most efficient database you could choose, but working with JSON files like this is a simple way to persist data in your project.
How to write to JSON files in Node.js
Just like reading JSON files, the fs module provides built-in methods for writing to JSON files.
You can use the writeFile and writeFileSync methods of the fs module. The difference between the two is that writeFile is asynchronous while writeFileSync is synchronous. Before writing a JSON file, make sure to serialize the JavaScript object to a JSON string using the JSON.stringify method.
Here is an example of writing a JSON file with fs.writeFile
And that’s it! Once the callback runs, the file has been written to disk. Note: the callback is only passed an error object; the file data we wrote isn’t passed to the callback. We can also write a file synchronously in the same way using fs.writeFileSync :
After your file is finished writing, it will look something like this:
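Assuming a customer record like { name, address, orderCount }, the file contents sit on one line:

```json
{"name":"Alice Smith","address":"123 Main St.","orderCount":3}
```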
Stringifying by default puts your data all on a single line. Optionally, you can make the output file human-readable by passing the number of spaces to indent by to JSON.stringify :
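A sketch of the indented form, with the same hypothetical customer record:

```javascript
const customer = { name: 'Alice Smith', address: '123 Main St.', orderCount: 3 };

// The third argument is the number of spaces to indent nested levels by.
const jsonString = JSON.stringify(customer, null, 2);
console.log(jsonString);
```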
Above, we told stringify to indent the data with 2 spaces. Now your output file should look like this:
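With the hypothetical customer record above, the indented file reads:

```json
{
  "name": "Alice Smith",
  "address": "123 Main St.",
  "orderCount": 3
}
```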
How to read and write to JSON files using third-party npm packages
In this section, we shall look at the most popular third-party Node packages for reading and writing data in JSON format.
You can simply use JSON.parse .
The definition of the JSON object is part of the ECMAScript 5 specification. node.js is built on Google Chrome's V8 engine, which adheres to ECMA standard. Therefore, node.js also has a global object JSON [docs] .
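For example:

```javascript
// Parse a JSON string into a JavaScript object.
const obj = JSON.parse('{"name": "Sara", "age": 23}');
console.log(obj.name); // prints "Sara"
```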
Note: JSON.parse can tie up the current thread because it is a synchronous method. So if you are planning to parse big JSON objects, use a streaming JSON parser.
you can require .json files.
For example if you have a config.json file in the same directory as your source code file you would use:
or (file extension can be omitted):
note that require is synchronous and only reads the file once; subsequent calls return the result from cache
Also note: you should only use this for local files under your absolute control, as it can potentially execute any code within the file.
You can use JSON.parse() .
You should be able to use the JSON object in any ECMAScript 5 compatible JavaScript implementation. And V8, upon which Node.js is built, is one of them.
You'll have to do some file operations with fs module.